Dec 05 15:55:11 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 05 15:55:12 crc restorecon[4690]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 15:55:12 crc restorecon[4690]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 15:55:12 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 
15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 15:55:12 crc 
restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 15:55:12 crc restorecon[4690]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 15:55:12 crc restorecon[4690]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 05 15:55:13 crc kubenswrapper[4778]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 15:55:13 crc kubenswrapper[4778]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 05 15:55:13 crc kubenswrapper[4778]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 15:55:13 crc kubenswrapper[4778]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 05 15:55:13 crc kubenswrapper[4778]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 05 15:55:13 crc kubenswrapper[4778]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.061719 4778 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069685 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069741 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069751 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069760 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069767 4778 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069775 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069784 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069793 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069801 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069808 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069814 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069821 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069828 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069835 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069841 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069847 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069853 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069860 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069867 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069873 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069879 4778 feature_gate.go:330] unrecognized feature gate: 
AzureWorkloadIdentity Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069886 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069893 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069899 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069907 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069914 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069920 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069940 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069947 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069953 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069959 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069966 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069972 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.069978 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070002 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070011 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070018 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070025 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070031 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070039 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070047 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070053 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070061 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070067 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070073 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070078 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070085 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 
15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070091 4778 feature_gate.go:330] unrecognized feature gate: Example Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070097 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070103 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070110 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070116 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070126 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070134 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070141 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070148 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070154 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070161 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070167 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070173 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070180 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070187 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070194 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070200 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070210 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070220 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070232 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070242 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070251 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
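Each feature_gate.go:330 warning above is the kubelet meeting a gate name missing from its own registry; the names (GatewayAPI, NewOLM, RouteAdvertisements, and so on) look like OpenShift cluster-level gates passed down wholesale, and the kubelet warns and ignores each one instead of failing. The interleaved feature_gate.go:351 and :353 lines are the opposite case: names it does know but that are deprecated (KMSv1) or already GA (CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, ValidatingAdmissionPolicy). To pull the distinct unknown names out of the noise, a sketch under the same journal.log assumption as above:

    import re

    UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\w+)")

    def unknown_gates(path="journal.log"):
        # Distinct gate names the kubelet warned about, alphabetized.
        with open(path) as f:
            return sorted(set(UNRECOGNIZED.findall(f.read())))
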
Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070260 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.070269 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070620 4778 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070650 4778 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070665 4778 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070676 4778 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070687 4778 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070695 4778 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070706 4778 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070715 4778 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070723 4778 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070730 4778 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070739 4778 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070750 4778 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070758 4778 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070766 4778 flags.go:64] FLAG: --cgroup-root="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070773 4778 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070781 4778 flags.go:64] FLAG: --client-ca-file="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070789 4778 flags.go:64] FLAG: --cloud-config="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070796 4778 flags.go:64] FLAG: --cloud-provider="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070804 4778 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070816 4778 flags.go:64] FLAG: --cluster-domain="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070822 4778 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070829 4778 flags.go:64] FLAG: --config-dir="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070835 4778 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070842 4778 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070851 4778 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070858 4778 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070864 4778 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 15:55:13 
crc kubenswrapper[4778]: I1205 15:55:13.070873 4778 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070880 4778 flags.go:64] FLAG: --contention-profiling="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070886 4778 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070892 4778 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070899 4778 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070905 4778 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070914 4778 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070920 4778 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070926 4778 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070933 4778 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070939 4778 flags.go:64] FLAG: --enable-server="true" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070945 4778 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070953 4778 flags.go:64] FLAG: --event-burst="100" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070960 4778 flags.go:64] FLAG: --event-qps="50" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070967 4778 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070973 4778 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070980 4778 flags.go:64] FLAG: --eviction-hard="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070988 4778 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.070994 4778 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071001 4778 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071008 4778 flags.go:64] FLAG: --eviction-soft="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071014 4778 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071020 4778 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071026 4778 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071032 4778 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071038 4778 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071045 4778 flags.go:64] FLAG: --fail-swap-on="true" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071051 4778 flags.go:64] FLAG: --feature-gates="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071058 4778 flags.go:64] FLAG: --file-check-frequency="20s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071065 4778 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 15:55:13 crc 
kubenswrapper[4778]: I1205 15:55:13.071071 4778 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071078 4778 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071084 4778 flags.go:64] FLAG: --healthz-port="10248" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071091 4778 flags.go:64] FLAG: --help="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071097 4778 flags.go:64] FLAG: --hostname-override="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071103 4778 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071111 4778 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071119 4778 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071127 4778 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071134 4778 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071142 4778 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071149 4778 flags.go:64] FLAG: --image-service-endpoint="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071156 4778 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071164 4778 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071172 4778 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071181 4778 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071188 4778 flags.go:64] FLAG: --kube-reserved="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071196 4778 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071203 4778 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071212 4778 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071220 4778 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071226 4778 flags.go:64] FLAG: --lock-file="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071233 4778 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071239 4778 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071245 4778 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071258 4778 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071268 4778 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071276 4778 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071283 4778 flags.go:64] FLAG: --logging-format="text" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071292 4778 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 15:55:13 crc 
kubenswrapper[4778]: I1205 15:55:13.071301 4778 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071309 4778 flags.go:64] FLAG: --manifest-url="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071317 4778 flags.go:64] FLAG: --manifest-url-header="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071327 4778 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071335 4778 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071343 4778 flags.go:64] FLAG: --max-pods="110" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071349 4778 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071356 4778 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071384 4778 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071391 4778 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071399 4778 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071405 4778 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071412 4778 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071428 4778 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071434 4778 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071441 4778 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071447 4778 flags.go:64] FLAG: --pod-cidr="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071454 4778 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071469 4778 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071477 4778 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071487 4778 flags.go:64] FLAG: --pods-per-core="0" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071495 4778 flags.go:64] FLAG: --port="10250" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071502 4778 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071510 4778 flags.go:64] FLAG: --provider-id="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071518 4778 flags.go:64] FLAG: --qos-reserved="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071526 4778 flags.go:64] FLAG: --read-only-port="10255" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071534 4778 flags.go:64] FLAG: --register-node="true" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071541 4778 flags.go:64] FLAG: --register-schedulable="true" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071548 4778 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 15:55:13 crc 
kubenswrapper[4778]: I1205 15:55:13.071561 4778 flags.go:64] FLAG: --registry-burst="10" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071569 4778 flags.go:64] FLAG: --registry-qps="5" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071577 4778 flags.go:64] FLAG: --reserved-cpus="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071588 4778 flags.go:64] FLAG: --reserved-memory="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071598 4778 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071606 4778 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071614 4778 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071622 4778 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071630 4778 flags.go:64] FLAG: --runonce="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071637 4778 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071644 4778 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071652 4778 flags.go:64] FLAG: --seccomp-default="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071660 4778 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071668 4778 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071677 4778 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071686 4778 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071694 4778 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071702 4778 flags.go:64] FLAG: --storage-driver-secure="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071710 4778 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071717 4778 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071726 4778 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071735 4778 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071742 4778 flags.go:64] FLAG: --system-cgroups="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071750 4778 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071765 4778 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071774 4778 flags.go:64] FLAG: --tls-cert-file="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071781 4778 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071794 4778 flags.go:64] FLAG: --tls-min-version="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071802 4778 flags.go:64] FLAG: --tls-private-key-file="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071810 4778 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 
15:55:13.071817 4778 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071825 4778 flags.go:64] FLAG: --topology-manager-scope="container" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071832 4778 flags.go:64] FLAG: --v="2" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071843 4778 flags.go:64] FLAG: --version="false" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071853 4778 flags.go:64] FLAG: --vmodule="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071861 4778 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.071869 4778 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072097 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072107 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072115 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072121 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072128 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072135 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072141 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072148 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
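The flags.go:64 FLAG block that just ended is the full effective command line, one flag per record with the value always double-quoted, which makes it easy to recover the configuration as a dictionary (for example, to diff two nodes). A sketch; it relies on the fact that the quoted values in this dump never contain embedded double quotes:

    import re

    FLAG = re.compile(r'FLAG: (--[\w-]+)="([^"]*)"')

    def kubelet_flags(path="journal.log"):
        # {flag: value} for every flags.go:64 record in the dump.
        with open(path) as f:
            return dict(FLAG.findall(f.read()))

    # kubelet_flags()["--system-reserved"]
    # -> 'cpu=200m,ephemeral-storage=350Mi,memory=350Mi'
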
Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072155 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072161 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072167 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072173 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072178 4778 feature_gate.go:330] unrecognized feature gate: Example Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072184 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072190 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072195 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072200 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072205 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072211 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072216 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072222 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072232 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072237 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072243 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072249 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072254 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072259 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072264 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072270 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072276 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072282 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072287 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072293 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072298 4778 feature_gate.go:330] unrecognized feature gate: 
ManagedBootImagesAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072304 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072309 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072315 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072320 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072326 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072334 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072340 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072346 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072353 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072360 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072390 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072397 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072403 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072410 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072416 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072423 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072430 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072437 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072445 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072456 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072463 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072470 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072477 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072484 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072490 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 15:55:13 crc 
kubenswrapper[4778]: W1205 15:55:13.072496 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072501 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072507 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072513 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072518 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072523 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072528 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072533 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072538 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072543 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072548 4778 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.072554 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.072571 4778 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.082838 4778 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.082889 4778 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083019 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083034 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083043 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083050 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083059 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083067 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083073 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083080 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 15:55:13 crc 
kubenswrapper[4778]: W1205 15:55:13.083090 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083103 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083112 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083119 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083126 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083133 4778 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083141 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083148 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083156 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083163 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083170 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083177 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083184 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083191 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083197 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083204 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083210 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083216 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083223 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083230 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083237 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083243 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083250 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083257 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083264 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083271 4778 feature_gate.go:330] 
unrecognized feature gate: MinimumKubeletVersion Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083280 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083287 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083294 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083301 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083307 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083314 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083320 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083327 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083334 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083340 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083347 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083354 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083361 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083411 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083419 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083426 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083432 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083439 4778 feature_gate.go:330] unrecognized feature gate: Example Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083446 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083456 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083466 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
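This pass also logged build facts as structured klog key="value" fields, server.go:491 "Kubelet version" kubeletVersion="v1.31.5" and the server.go:493 Golang settings above; the same quoting convention holds throughout, so one small extractor covers any such record:

    import re

    KV = re.compile(r'(\w+)="([^"]*)"')

    def klog_fields(record):
        # key="value" pairs of a structured klog record, e.g.
        # klog_fields('... "Kubelet version" kubeletVersion="v1.31.5"')
        # returns {'kubeletVersion': 'v1.31.5'}
        return dict(KV.findall(record))
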
Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083476 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083486 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083493 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083501 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083508 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083515 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083523 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083530 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083537 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083544 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083552 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083560 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083566 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083576 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083585 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083594 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.083607 4778 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083916 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083933 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083941 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083949 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083956 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083963 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083970 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.083977 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084051 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084061 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084068 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084075 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084082 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084089 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084096 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084103 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084111 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084119 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084160 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084170 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 
05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084179 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084190 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084199 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084207 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084215 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084224 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084232 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084240 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084249 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084257 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084266 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084274 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
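By now the same warning block has scrolled past several times; the repetition appears to come from the gate set being re-parsed at successive startup stages (command-line handling, then the config file, then again before each final summary), with every pass re-logging every warning, so it signals repeated parsing rather than distinct problems. Counting occurrences per gate makes the pattern visible, since each unknown gate should show the same multiplicity:

    import re
    from collections import Counter

    UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\w+)")

    def gate_multiplicity(path="journal.log"):
        # Warnings per gate name; uniform counts across all names point
        # to repeated parse passes, not repeated problems.
        with open(path) as f:
            return Counter(UNRECOGNIZED.findall(f.read()))
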
Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084283 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084291 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084300 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084308 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084316 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084323 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084330 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084337 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084344 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084352 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084360 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084393 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084401 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084408 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084417 4778 feature_gate.go:330] unrecognized feature gate: Example Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084424 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084431 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084440 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084448 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084456 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084463 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084471 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084477 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084484 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084491 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084497 4778 feature_gate.go:330] 
unrecognized feature gate: MixedCPUsAllocation Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084503 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084510 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084517 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084524 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084531 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084538 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084545 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084552 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084559 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084566 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084573 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084580 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.084589 4778 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.084602 4778 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.084971 4778 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.089299 4778 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.089512 4778 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
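Every parse pass closes with an authoritative feature_gate.go:386 "feature gates: {map[...]}" record, and all passes in this dump settle on the same fifteen-entry map, so the last occurrence is the effective state. A sketch that turns it back into booleans, under the same journal.log assumption:

    import re

    SUMMARY = re.compile(r"feature gates: \{map\[([^\]]*)\]\}")

    def effective_gates(path="journal.log"):
        # Last feature_gate.go:386 summary as {name: bool}; {} if absent.
        with open(path) as f:
            found = SUMMARY.findall(f.read())
        if not found:
            return {}
        return {name: value == "true"
                for name, value in (item.split(":") for item in found[-1].split())}

    # effective_gates()["KMSv1"] -> True
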
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.090295 4778 server.go:997] "Starting client certificate rotation"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.090330 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.091036 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-20 01:57:06.534893659 +0000 UTC
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.091161 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.098100 4778 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.098508 4778 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.100423 4778 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.111429 4778 log.go:25] "Validated CRI v1 runtime API"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.125972 4778 log.go:25] "Validated CRI v1 image API"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.127810 4778 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.131413 4778 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-05-15-51-24-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.131458 4778 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.159743 4778 manager.go:217] Machine: {Timestamp:2025-12-05 15:55:13.157518015 +0000 UTC m=+0.261314455 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d6159bdf-f1e9-405b-9393-3eae3aaf61c7 BootID:65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:97:5a:7a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:97:5a:7a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f8:bd:21 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:e8:4e:d8 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:3c:b6:72 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:81:80:7f Speed:-1 Mtu:1496} {Name:eth10 MacAddress:32:a1:ef:c9:b2:69 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:5a:72:37:ef:ba:f4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.160144 4778 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.160331 4778 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.160868 4778 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.161307 4778 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.161408 4778 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.161767 4778 topology_manager.go:138] "Creating topology manager with none policy"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.161787 4778 container_manager_linux.go:303] "Creating device plugin manager"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.162150 4778 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.162205 4778 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.162712 4778 state_mem.go:36] "Initialized new in-memory state store"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.162868 4778 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.164074 4778 kubelet.go:418] "Attempting to sync node with API server"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.164117 4778 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.164160 4778 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.164182 4778 kubelet.go:324] "Adding apiserver pod source"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.164203 4778 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.166542 4778 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.166949 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.167013 4778 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.166952 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused
Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.167062 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError"
Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.167146 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.168224 4778 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169061 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169129 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169145 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169159 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169182 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169196 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169210 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169233 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169250 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169275 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169296 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169309 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.169661 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.172456 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.174007 4778 server.go:1280] "Started kubelet"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.174599 4778 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.174592 4778 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.175318 4778 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.176387 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.175946 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e5cc6ddf1b7bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 15:55:13.172699068 +0000 UTC m=+0.276495448,LastTimestamp:2025-12-05 15:55:13.172699068 +0000 UTC m=+0.276495448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.176436 4778 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.176640 4778 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 05 15:55:13 crc systemd[1]: Started Kubernetes Kubelet.
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.176745 4778 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.176793 4778 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.176925 4778 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.176490 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 16:40:44.516266661 +0000 UTC
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.176974 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 816h45m31.339299349s for next certificate rotation
Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.178840 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="200ms"
Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.178873 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.179268 4778 factory.go:55] Registering systemd factory
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.179304 4778 factory.go:221] Registration of the systemd container factory successfully
Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.179549 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.179783 4778 server.go:460] "Adding debug handlers to kubelet server"
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.179777 4778 factory.go:153] Registering CRI-O factory
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.180006 4778 factory.go:221] Registration of the crio container factory successfully
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.180115 4778 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.180162 4778 factory.go:103] Registering Raw factory
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.180189 4778 manager.go:1196] Started watching for new ooms in manager
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.181360 4778 manager.go:319] Starting recovery of all containers
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185560 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185827 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185839 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185852 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185863 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185874 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185884 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185895 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185906 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185916 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185926 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185938 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185948 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185960 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185973 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.185985 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186288 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186302 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186333 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186345 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186358 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186849 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186861 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186870 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186884 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186904 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186916 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186927 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186940 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186950 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186963 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186973 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186984 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.186994 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187005 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187015 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187030 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187039 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187050 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187059 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187069 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187078 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187089 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187099 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187110 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187121 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187130 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187140 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187153 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187190 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187201 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187227 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187246 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187265 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187279 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187292 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187305 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187319 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187332 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187343 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187352 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187378 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187388 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187397 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187408 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187439 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187450 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187459 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187470 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187479 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187486 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187495 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187504 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187512 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187520 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187529 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187539 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187548 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187557 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187585 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187595 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187604 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187614 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187624 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187639 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187654 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187662 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187671 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187681 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187689 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187701 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187710 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187721 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187773 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187783 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187796 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187806 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187816 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187829 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187838 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187849 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187861 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187871 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187883 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187931 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187944 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187955 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187965 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187975 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187986 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.187995 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188005 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188016 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188026 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188036 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188048 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188057 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188070 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188079 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188088 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188098 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188107 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188116 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188125 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188135 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188143 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188154 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188164 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188174 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188183 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188191 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188199 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188208 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188217 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def"
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188225 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188235 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188245 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188254 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188265 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188276 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188287 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188297 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188305 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188314 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188324 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188337 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188348 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188359 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188387 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188398 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188407 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188415 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188425 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188436 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188444 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188452 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188462 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188473 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188483 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188501 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188512 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.188564 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.194894 4778 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195056 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195108 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195156 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195547 4778 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195599 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195623 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195648 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195767 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195826 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195841 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195875 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195890 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195909 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195921 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195935 4778 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195952 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195963 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195977 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.195989 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196000 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196015 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196028 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196050 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196066 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196078 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196092 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196101 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196114 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196125 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196137 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196151 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196165 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196180 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196191 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196203 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196217 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196228 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196240 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196254 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196269 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196296 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196307 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196317 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196333 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196344 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196355 4778 reconstruct.go:97] "Volume reconstruction finished" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.196377 4778 reconciler.go:26] "Reconciler: start to sync state" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.213917 4778 manager.go:324] Recovery completed Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.226465 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.228427 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.228482 4778 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.228492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.229300 4778 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.229320 4778 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.229350 4778 state_mem.go:36] "Initialized new in-memory state store" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.241681 4778 policy_none.go:49] "None policy: Start" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.243083 4778 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.243128 4778 state_mem.go:35] "Initializing new in-memory state store" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.246003 4778 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.248179 4778 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.248217 4778 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.248257 4778 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.248307 4778 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.250465 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.250549 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.277536 4778 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.293109 4778 manager.go:334] "Starting Device Plugin manager" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.293936 4778 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.293965 4778 server.go:79] "Starting device plugin registration server" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.294618 4778 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.294642 4778 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.294926 4778 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 15:55:13 crc 
kubenswrapper[4778]: I1205 15:55:13.295054 4778 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.295072 4778 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.303730 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.349085 4778 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.349217 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.350483 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.350521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.350531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.350658 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.351175 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.351273 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.351513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.351564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.351578 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.351756 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.351983 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.352056 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.352705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.352725 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.352734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.353098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.353139 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.353152 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.353357 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.353397 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.353409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.353518 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.353692 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.353733 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.354207 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.354236 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.354246 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.354340 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.354459 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.354495 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.355068 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.355097 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.355108 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.355696 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.355775 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.355801 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.356055 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.356111 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.356132 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.356462 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.356523 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.357755 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.357793 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.357808 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.381229 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="400ms" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.394947 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.396785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.396820 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.396833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.396859 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.397525 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398137 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398178 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398278 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398438 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398503 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398553 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398611 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398657 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398700 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398740 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398783 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398834 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.398913 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.399033 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.399124 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.499841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.499921 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.499961 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.499993 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500026 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500054 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500085 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500111 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500109 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500149 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500183 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500179 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500221 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500233 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500252 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500287 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500281 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500306 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500226 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500291 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500411 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500334 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500289 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500447 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500343 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500486 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500555 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" 
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500563 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500593 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.500607 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.598456 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.600593 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.600684 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.600706 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.600753 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.601705 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.692488 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.719251 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.727192 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.728624 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-99439474c89be1f0f3ee06a5156c4f80a795089ee42773ec4bc921863fd3102b WatchSource:0}: Error finding container 99439474c89be1f0f3ee06a5156c4f80a795089ee42773ec4bc921863fd3102b: Status 404 returned error can't find the container with id 99439474c89be1f0f3ee06a5156c4f80a795089ee42773ec4bc921863fd3102b Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.749542 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.750148 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-c4211033151e2270b282df44ef2a36923bacfce0074cd7ecc25692ee303eeb3a WatchSource:0}: Error finding container c4211033151e2270b282df44ef2a36923bacfce0074cd7ecc25692ee303eeb3a: Status 404 returned error can't find the container with id c4211033151e2270b282df44ef2a36923bacfce0074cd7ecc25692ee303eeb3a Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.751968 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-2948bf168a115da70d9fc158f200e6b9523187ef0339527b0031b6b31994650e WatchSource:0}: Error finding container 2948bf168a115da70d9fc158f200e6b9523187ef0339527b0031b6b31994650e: Status 404 returned error can't find the container with id 2948bf168a115da70d9fc158f200e6b9523187ef0339527b0031b6b31994650e Dec 05 15:55:13 crc kubenswrapper[4778]: I1205 15:55:13.757785 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.759955 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-d5a4038c94c152aa489bc97d8906266be51f7066477cf3e31a69f7c956756a9c WatchSource:0}: Error finding container d5a4038c94c152aa489bc97d8906266be51f7066477cf3e31a69f7c956756a9c: Status 404 returned error can't find the container with id d5a4038c94c152aa489bc97d8906266be51f7066477cf3e31a69f7c956756a9c Dec 05 15:55:13 crc kubenswrapper[4778]: E1205 15:55:13.783217 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="800ms" Dec 05 15:55:13 crc kubenswrapper[4778]: W1205 15:55:13.792214 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-1c95fc06c766f55186b2866cc60d8073079bf88f52b90b7606943e498c6a4300 WatchSource:0}: Error finding container 1c95fc06c766f55186b2866cc60d8073079bf88f52b90b7606943e498c6a4300: Status 404 returned error can't find the container with id 1c95fc06c766f55186b2866cc60d8073079bf88f52b90b7606943e498c6a4300 Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.002428 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.004496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.004558 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.004572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.004599 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 
15:55:14 crc kubenswrapper[4778]: E1205 15:55:14.005549 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.173823 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.253960 4778 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03" exitCode=0 Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.254040 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03"} Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.254177 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1c95fc06c766f55186b2866cc60d8073079bf88f52b90b7606943e498c6a4300"} Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.254303 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.255945 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.255975 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.255985 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.257577 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1adeaaa8d776ade1d5b8f53e69b5fa90425305b12066b7322ad843f900dff086" exitCode=0 Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.257699 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1adeaaa8d776ade1d5b8f53e69b5fa90425305b12066b7322ad843f900dff086"} Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.257780 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d5a4038c94c152aa489bc97d8906266be51f7066477cf3e31a69f7c956756a9c"} Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.257909 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.260078 4778 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e" exitCode=0 Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.260258 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e"} Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.260290 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2948bf168a115da70d9fc158f200e6b9523187ef0339527b0031b6b31994650e"} Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.262024 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.264030 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.264067 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.264081 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.264425 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.264465 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.264481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.266039 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e"} Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.266077 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c4211033151e2270b282df44ef2a36923bacfce0074cd7ecc25692ee303eeb3a"} Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.270463 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d" exitCode=0 Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.270517 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d"} Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.270556 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"99439474c89be1f0f3ee06a5156c4f80a795089ee42773ec4bc921863fd3102b"} Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.270677 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.272044 4778 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.272165 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.272184 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.274781 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.275558 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.275594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.275605 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:14 crc kubenswrapper[4778]: W1205 15:55:14.384069 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Dec 05 15:55:14 crc kubenswrapper[4778]: E1205 15:55:14.384184 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Dec 05 15:55:14 crc kubenswrapper[4778]: E1205 15:55:14.473919 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e5cc6ddf1b7bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 15:55:13.172699068 +0000 UTC m=+0.276495448,LastTimestamp:2025-12-05 15:55:13.172699068 +0000 UTC m=+0.276495448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 15:55:14 crc kubenswrapper[4778]: E1205 15:55:14.584718 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="1.6s" Dec 05 15:55:14 crc kubenswrapper[4778]: W1205 15:55:14.689091 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Dec 05 15:55:14 crc kubenswrapper[4778]: E1205 15:55:14.689194 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list 
*v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Dec 05 15:55:14 crc kubenswrapper[4778]: W1205 15:55:14.705641 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Dec 05 15:55:14 crc kubenswrapper[4778]: E1205 15:55:14.705742 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.806469 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.809323 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.809367 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.809503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:14 crc kubenswrapper[4778]: I1205 15:55:14.809540 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 15:55:14 crc kubenswrapper[4778]: E1205 15:55:14.810615 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.239790 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.277884 4778 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1" exitCode=0 Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.277994 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1"} Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.278187 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.279632 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.279809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.280068 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.284930 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d48dc7e2c7a5f932f9c2725295043f73520ff180a527396540576889d6537755"} Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.285061 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.286047 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.286067 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.286076 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.289158 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"acf9869dc28a932f93dc17e7251a9439cc0800654b657778a019904dea1065ba"} Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.289416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c8be3eb2a66916642073568cde0334019c613c661e21540feaeeaba10a686477"} Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.289610 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.289616 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"46dfbfa7cc094d6fe37248fcff273d561a58dfce35ed231a303e4bd70cf0fbd7"} Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.292098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.292141 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.292167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.296483 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4"} Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.296703 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c"} Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.296887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3"} 
Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.296497 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.298576 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.298601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.298613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.300131 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5"} Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.300349 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63"} Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.300684 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907"} Dec 05 15:55:15 crc kubenswrapper[4778]: I1205 15:55:15.300839 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258"} Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.307498 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe"} Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.307558 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.309754 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.309827 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.309853 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.309846 4778 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530" exitCode=0 Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.310009 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.310073 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 
15:55:16.310203 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530"} Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.310352 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.310991 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.311587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.311623 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.311637 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.312504 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.312561 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.312580 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.312636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.312653 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.312664 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.411862 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.414197 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.414277 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.414297 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.414353 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 15:55:16 crc kubenswrapper[4778]: I1205 15:55:16.437544 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.035335 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.317875 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481"} Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.317957 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925"} Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.317993 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827"} Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.318001 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.317996 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.318084 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.319534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.319574 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.319592 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.319892 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.319936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.319949 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.862924 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.863255 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.865261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.865322 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:17 crc kubenswrapper[4778]: I1205 15:55:17.865341 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:18 crc kubenswrapper[4778]: I1205 15:55:18.326145 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c"} Dec 05 15:55:18 crc kubenswrapper[4778]: I1205 15:55:18.326237 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60"} Dec 05 15:55:18 crc kubenswrapper[4778]: I1205 15:55:18.326166 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 15:55:18 crc kubenswrapper[4778]: I1205 15:55:18.326272 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:18 crc kubenswrapper[4778]: I1205 15:55:18.326333 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:18 crc kubenswrapper[4778]: I1205 15:55:18.327777 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:18 crc kubenswrapper[4778]: I1205 15:55:18.327797 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:18 crc kubenswrapper[4778]: I1205 15:55:18.327821 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:18 crc kubenswrapper[4778]: I1205 15:55:18.327836 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:18 crc kubenswrapper[4778]: I1205 15:55:18.327857 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:18 crc kubenswrapper[4778]: I1205 15:55:18.327840 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:18 crc kubenswrapper[4778]: I1205 15:55:18.999754 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.000082 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.002352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.002463 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.002497 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.326023 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.332194 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.333812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.333867 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.333886 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.453266 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.453566 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.453630 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.455348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.455461 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:19 crc kubenswrapper[4778]: I1205 15:55:19.455481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:20 crc kubenswrapper[4778]: I1205 15:55:20.336655 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:20 crc kubenswrapper[4778]: I1205 15:55:20.338343 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:20 crc kubenswrapper[4778]: I1205 15:55:20.338448 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:20 crc kubenswrapper[4778]: I1205 15:55:20.338474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:21 crc kubenswrapper[4778]: I1205 15:55:21.402008 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:21 crc kubenswrapper[4778]: I1205 15:55:21.402854 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:21 crc kubenswrapper[4778]: I1205 15:55:21.404853 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:21 crc kubenswrapper[4778]: I1205 15:55:21.404969 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:21 crc kubenswrapper[4778]: I1205 15:55:21.404995 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:22 crc kubenswrapper[4778]: I1205 15:55:22.000330 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 15:55:22 crc kubenswrapper[4778]: I1205 15:55:22.000512 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 15:55:22 crc kubenswrapper[4778]: I1205 15:55:22.541841 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:22 crc kubenswrapper[4778]: I1205 15:55:22.542122 4778 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Dec 05 15:55:22 crc kubenswrapper[4778]: I1205 15:55:22.548595 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:22 crc kubenswrapper[4778]: I1205 15:55:22.548695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:22 crc kubenswrapper[4778]: I1205 15:55:22.548724 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:22 crc kubenswrapper[4778]: I1205 15:55:22.551610 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:23 crc kubenswrapper[4778]: E1205 15:55:23.304090 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 15:55:23 crc kubenswrapper[4778]: I1205 15:55:23.345659 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:23 crc kubenswrapper[4778]: I1205 15:55:23.347189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:23 crc kubenswrapper[4778]: I1205 15:55:23.347259 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:23 crc kubenswrapper[4778]: I1205 15:55:23.347279 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:24 crc kubenswrapper[4778]: I1205 15:55:24.214764 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:24 crc kubenswrapper[4778]: I1205 15:55:24.349001 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:24 crc kubenswrapper[4778]: I1205 15:55:24.350971 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:24 crc kubenswrapper[4778]: I1205 15:55:24.351066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:24 crc kubenswrapper[4778]: I1205 15:55:24.351091 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:24 crc kubenswrapper[4778]: I1205 15:55:24.355861 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:55:24 crc kubenswrapper[4778]: I1205 15:55:24.684060 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 05 15:55:24 crc kubenswrapper[4778]: I1205 15:55:24.684284 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:24 crc kubenswrapper[4778]: I1205 15:55:24.685612 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:24 crc kubenswrapper[4778]: I1205 15:55:24.685661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:24 crc kubenswrapper[4778]: I1205 15:55:24.685672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 15:55:24 crc kubenswrapper[4778]: W1205 15:55:24.827704 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 15:55:24 crc kubenswrapper[4778]: I1205 15:55:24.828071 4778 trace.go:236] Trace[1744362378]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 15:55:14.825) (total time: 10002ms): Dec 05 15:55:24 crc kubenswrapper[4778]: Trace[1744362378]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:55:24.827) Dec 05 15:55:24 crc kubenswrapper[4778]: Trace[1744362378]: [10.002054799s] [10.002054799s] END Dec 05 15:55:24 crc kubenswrapper[4778]: E1205 15:55:24.828267 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 15:55:25 crc kubenswrapper[4778]: I1205 15:55:25.174793 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 05 15:55:25 crc kubenswrapper[4778]: E1205 15:55:25.247337 4778 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 15:55:25 crc kubenswrapper[4778]: I1205 15:55:25.351965 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:25 crc kubenswrapper[4778]: I1205 15:55:25.353575 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:25 crc kubenswrapper[4778]: I1205 15:55:25.353638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:25 crc kubenswrapper[4778]: I1205 15:55:25.353651 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:26 crc kubenswrapper[4778]: I1205 15:55:26.089180 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 05 15:55:26 crc kubenswrapper[4778]: I1205 15:55:26.089290 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 15:55:26 crc kubenswrapper[4778]: I1205 15:55:26.097148 4778 patch_prober.go:28] interesting 
pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 05 15:55:26 crc kubenswrapper[4778]: I1205 15:55:26.097595 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 15:55:27 crc kubenswrapper[4778]: I1205 15:55:27.045344 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]log ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]etcd ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/priority-and-fairness-filter ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/start-apiextensions-informers ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/start-apiextensions-controllers ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/crd-informer-synced ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/start-system-namespaces-controller ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 05 15:55:27 crc kubenswrapper[4778]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 05 15:55:27 crc kubenswrapper[4778]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/bootstrap-controller ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/start-kube-aggregator-informers ok Dec 05 15:55:27 crc 
kubenswrapper[4778]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/apiservice-registration-controller ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/apiservice-discovery-controller ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]autoregister-completion ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/apiservice-openapi-controller ok Dec 05 15:55:27 crc kubenswrapper[4778]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 05 15:55:27 crc kubenswrapper[4778]: livez check failed Dec 05 15:55:27 crc kubenswrapper[4778]: I1205 15:55:27.045527 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:55:29 crc kubenswrapper[4778]: I1205 15:55:29.455442 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 15:55:29 crc kubenswrapper[4778]: I1205 15:55:29.475742 4778 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.088860 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.092124 4778 trace.go:236] Trace[142118386]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 15:55:17.510) (total time: 13581ms): Dec 05 15:55:31 crc kubenswrapper[4778]: Trace[142118386]: ---"Objects listed" error: 13581ms (15:55:31.091) Dec 05 15:55:31 crc kubenswrapper[4778]: Trace[142118386]: [13.581518374s] [13.581518374s] END Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.092176 4778 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.093282 4778 trace.go:236] Trace[1832935184]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 15:55:16.230) (total time: 14862ms): Dec 05 15:55:31 crc kubenswrapper[4778]: Trace[1832935184]: ---"Objects listed" error: 14861ms (15:55:31.092) Dec 05 15:55:31 crc kubenswrapper[4778]: Trace[1832935184]: [14.86225092s] [14.86225092s] END Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.093322 4778 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.093431 4778 trace.go:236] Trace[1820484940]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 15:55:16.600) (total time: 14492ms): Dec 05 15:55:31 crc kubenswrapper[4778]: Trace[1820484940]: ---"Objects listed" error: 14492ms (15:55:31.093) Dec 05 15:55:31 crc kubenswrapper[4778]: Trace[1820484940]: [14.492906113s] [14.492906113s] END Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.093471 4778 reflector.go:368] Caches 
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.093471 4778 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.094041 4778 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.095717 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.166931 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.172293 4778 apiserver.go:52] "Watching apiserver"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.175176 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.176038 4778 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.176412 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.176820 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.176977 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.176994 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.177173 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.177322 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.177497 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.177333 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
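Every "Error syncing pod" above shares one root cause: nothing has written a CNI network config yet this early in boot, so the runtime reports NetworkPluginNotReady. A quick sketch of the same directory check (the path is taken verbatim from the log; the extension filter is an assumption, the real loader in the runtime is more involved):

```go
// cnicheck.go - is there any CNI network config on this node yet?
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // OpenShift's conf dir, per the log line above
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read", dir, "-", err)
		return
	}
	found := false
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // assumed filter; good enough for a spot check
			fmt.Println("CNI config present:", e.Name())
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file; the network provider has not written one yet")
	}
}
```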
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.177577 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.177656 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.181100 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.181119 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.181106 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.181603 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.181891 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.181906 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.184203 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.184458 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.184568 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.187151 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.194416 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.194555 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.194626 
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.194626 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.194701 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.194789 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.194865 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.194936 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.195000 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.197443 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.197514 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.197543 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
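The kube-api-access-* volumes being mounted here are projected volumes that combine the serviceaccount token, kube-root-ca.crt, and the namespace file into one directory, which is why the projected-volume errors below list kube-root-ca.crt and openshift-service-ca.crt as prerequisites. A sketch that inspects the assembled result from inside a running container (the mount point is the standard in-pod path; this only works in-cluster):

```go
// savol.go - list what the projected kube-api-access volume actually delivered.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Standard mount point for the projected serviceaccount volume inside a pod.
	dir := "/var/run/secrets/kubernetes.io/serviceaccount"
	for _, name := range []string{"token", "ca.crt", "namespace"} {
		b, err := os.ReadFile(filepath.Join(dir, name))
		if err != nil {
			fmt.Printf("%s: missing (%v)\n", name, err)
			continue
		}
		fmt.Printf("%s: %d bytes\n", name, len(b)) // report size only; never print the token
	}
}
```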
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.197566 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.197583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.197604 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.197689 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.197756 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:31.697733208 +0000 UTC m=+18.801529588 (durationBeforeRetry 500ms). 
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.198257 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60988->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.198270 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.198297 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60988->192.168.126.11:17697: read: connection reset by peer"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.198556 4778 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.198946 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.199147 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.199225 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:31.699204208 +0000 UTC m=+18.803000598 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.210455 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.212142 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.214738 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.214774 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.214795 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.214860 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:31.714839089 +0000 UTC m=+18.818635479 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.218853 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.219849 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.221194 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.221724 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.221759 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.221800 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.221902 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:31.721877258 +0000 UTC m=+18.825673658 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.223529 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.224138 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.236434 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.241973 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.251798 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.273611 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.277984 4778 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.293539 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301672 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301716 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301735 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301751 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301768 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301783 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301797 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301814 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301830 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301846 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301862 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301879 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301893 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301909 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301924 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301942 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301955 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301969 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.301993 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302007 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302023 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302037 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302052 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302069 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302093 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302125 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302141 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302154 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302185 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302200 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302219 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302235 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302251 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302284 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302299 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302315 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302344 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302358 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302411 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302434 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302450 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302464 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302478 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302491 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302506 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302521 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302534 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302549 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302563 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302576 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302591 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302605 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302618 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302633 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302646 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302663 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 
15:55:31.302678 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302695 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302711 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302724 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302738 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302752 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302765 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302780 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302795 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302809 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:31 crc 
kubenswrapper[4778]: I1205 15:55:31.302827 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302843 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302857 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302872 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302887 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302901 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302917 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302933 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302947 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302963 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302978 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.302993 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303008 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303028 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303043 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303058 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303075 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303090 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303105 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303120 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303135 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303149 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303164 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303179 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303195 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303210 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303224 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303238 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303254 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303269 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303285 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303299 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303317 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303333 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303350 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303382 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303398 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303415 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303430 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 
15:55:31.303446 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303461 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303475 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303489 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303504 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303521 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303535 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303550 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303566 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303581 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 
15:55:31.303597 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303611 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303640 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303655 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303670 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303685 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303704 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303797 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303816 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303832 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 15:55:31 crc 
kubenswrapper[4778]: I1205 15:55:31.303846 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303861 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303886 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303902 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303919 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303950 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303965 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.303989 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304004 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304031 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304072 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304096 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304113 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304128 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304144 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304160 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304151 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304175 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304258 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304280 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304304 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304323 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304340 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304359 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304397 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304415 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304431 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304447 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304468 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304486 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304503 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304520 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304540 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304558 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304576 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304593 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304610 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304629 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304625 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304651 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304670 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304687 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304705 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304721 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304740 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304762 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 15:55:31 crc 
kubenswrapper[4778]: I1205 15:55:31.304780 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304800 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304817 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304799 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304834 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304869 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304908 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304936 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304957 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304977 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.304997 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305019 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305038 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305057 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305060 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305074 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305095 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305119 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305143 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305163 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305180 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305200 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305220 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305236 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305254 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305280 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305304 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305324 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305342 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305346 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305450 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305509 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305572 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305625 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305638 4778 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305652 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305664 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305675 4778 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305687 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305700 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305772 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305858 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305880 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305864 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.305893 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.309471 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.310055 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.310236 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.310957 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.311100 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.311232 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.311673 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.311815 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.311951 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.312088 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.312219 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.312420 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.312580 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.312707 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.312873 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.313793 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:55:31.813770383 +0000 UTC m=+18.917566763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.314244 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.314568 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.314729 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.322992 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.323873 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.324593 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.324766 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.324766 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.325005 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.325031 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.325182 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.325226 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.325345 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.325474 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.325257 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.325838 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.326039 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.326184 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.326629 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.326650 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.326865 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.327106 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.327638 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.327717 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.327977 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.327996 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.328126 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.328550 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.328828 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.329019 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.329319 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.329515 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.330588 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.330639 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.330705 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.331930 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.330617 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.333205 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.334563 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.334822 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.334968 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.337186 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.337185 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.337344 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.337556 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.337598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.338166 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.338170 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.338445 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.338544 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.338793 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.338995 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.339266 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.339356 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.339356 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.339540 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.339575 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.339978 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.340013 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.340222 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.340502 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.340680 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.343934 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.343984 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.344091 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.344332 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.344412 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.344972 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.345131 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.345149 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.345334 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.345340 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.345340 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.340447 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.345542 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.345558 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.345611 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.345682 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.346162 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.346704 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.347005 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.347746 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.347838 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.348026 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.348159 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.348276 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.348425 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.348500 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.348754 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.348837 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.348866 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.348895 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.349111 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.349127 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.349186 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.349351 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.349487 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.349659 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.349930 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.349947 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.350900 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.351805 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.353151 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.353241 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.353302 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.353497 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.353612 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.353672 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.353882 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.353890 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.353939 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.353983 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.354353 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.354449 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.354517 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.354555 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.354602 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.354637 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.354765 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.355861 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.356308 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.356410 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.356546 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.356826 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.356933 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.356867 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.357359 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.357685 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.357782 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.357813 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.357837 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.357878 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.357869 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.357908 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.357959 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.358294 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.358383 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.358385 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.358457 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.358639 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.358642 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.358816 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.359563 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.359875 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.360071 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.360105 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.360208 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.360740 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.360797 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.360956 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.361256 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.361489 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.361546 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.361586 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.361629 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.361357 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.361695 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.361951 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.361964 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.362042 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.362108 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.362145 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.362085 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.363557 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.364394 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.364448 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.368684 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.373517 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"56380bf0980b7c5146997a460fd829a7555d985a540e343cc8a5ae7ac2da51d2"} Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.375510 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.376789 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.377357 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.380592 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe" exitCode=255 Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.380690 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe"} Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.388642 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.389488 4778 scope.go:117] "RemoveContainer" containerID="f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.390251 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.395730 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.400765 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.405545 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406235 4778 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406259 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406270 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406280 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406289 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406296 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406305 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406313 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406321 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406329 4778 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406337 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406345 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406355 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406378 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406389 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406397 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406404 4778 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406412 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406420 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406428 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406436 4778 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406444 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406452 4778 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406460 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406467 4778 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406477 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406487 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406495 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406505 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406513 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406520 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406529 4778 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406562 4778 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406575 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406585 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406596 4778 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406609 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406704 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406715 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406726 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406760 4778 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406770 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406780 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406788 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406796 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406805 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406814 4778 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406822 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406831 4778 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406839 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406848 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406881 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406890 4778 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406899 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406907 4778 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406916 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406925 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406938 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406946 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406955 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406963 4778 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406972 4778 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406980 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406989 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.406997 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407006 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407015 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407024 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407032 4778 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407584 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407598 4778 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407608 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407618 4778 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407627 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407635 4778 reconciler_common.go:293] 
"Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407642 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407650 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407661 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407752 4778 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407764 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407774 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407782 4778 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407852 4778 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407861 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407869 4778 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407896 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407905 4778 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407913 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407922 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.407932 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408303 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408322 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408332 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408340 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408348 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408355 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408385 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408395 4778 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408412 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408420 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408428 4778 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408437 4778 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408444 4778 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408454 4778 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408462 4778 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408469 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408477 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408486 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408494 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408502 4778 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408510 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408520 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408527 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408536 4778 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408543 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408551 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408558 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408568 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408576 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408585 4778 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408593 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408601 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408609 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408618 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408626 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408634 4778 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408642 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408668 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408676 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408685 4778 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408694 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408703 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408711 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408720 4778 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408728 4778 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408736 4778 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408745 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408753 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408762 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc 
kubenswrapper[4778]: I1205 15:55:31.408772 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408780 4778 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408793 4778 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408817 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408829 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408839 4778 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408851 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408863 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408875 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408886 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408896 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408906 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408917 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408927 4778 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408937 4778 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408947 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408957 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408968 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408978 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408989 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.408999 4778 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409008 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409016 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409024 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409147 4778 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409156 4778 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409166 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409175 4778 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409183 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409191 4778 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409201 4778 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409210 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409221 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409228 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409236 4778 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409246 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409260 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409268 4778 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409277 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409285 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409293 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409301 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409310 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409318 4778 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409326 4778 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409334 4778 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409342 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409350 4778 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.409358 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.419776 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.433542 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.442271 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.450940 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.459347 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.508956 4778 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.516045 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 15:55:31 crc kubenswrapper[4778]: W1205 15:55:31.529507 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-639e0fad0fc642264b32f17cdeae07290e4ae8f13d0cb472eceb1ad1d6690dfe WatchSource:0}: Error finding container 639e0fad0fc642264b32f17cdeae07290e4ae8f13d0cb472eceb1ad1d6690dfe: Status 404 returned error can't find the container with id 639e0fad0fc642264b32f17cdeae07290e4ae8f13d0cb472eceb1ad1d6690dfe Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.552860 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 15:55:31 crc kubenswrapper[4778]: W1205 15:55:31.576947 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-457bc46c52816435f1255dde152fcc0f01cd212077761a2e2e27dbef465a8513 WatchSource:0}: Error finding container 457bc46c52816435f1255dde152fcc0f01cd212077761a2e2e27dbef465a8513: Status 404 returned error can't find the container with id 457bc46c52816435f1255dde152fcc0f01cd212077761a2e2e27dbef465a8513 Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.714080 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.714193 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.714300 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.714327 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.714390 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:32.714354809 +0000 UTC m=+19.818151189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.714455 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:32.714434351 +0000 UTC m=+19.818230731 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.815135 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.815219 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:31 crc kubenswrapper[4778]: I1205 15:55:31.815242 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.815399 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.815417 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.815428 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.815423 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:55:32.815380799 +0000 UTC m=+19.919177199 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.815477 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:32.815463432 +0000 UTC m=+19.919259812 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.815513 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.815571 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.815588 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:31 crc kubenswrapper[4778]: E1205 15:55:31.815731 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:32.815686878 +0000 UTC m=+19.919483268 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.040342 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.061901 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.078106 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.094232 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.113830 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.128285 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.143277 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.157088 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05
T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.168503 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.249410 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.249591 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.385752 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e"} Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.385891 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"639e0fad0fc642264b32f17cdeae07290e4ae8f13d0cb472eceb1ad1d6690dfe"} Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.387553 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.389509 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8"} Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.390358 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.391942 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c"} Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.391976 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8"} Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.394380 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"457bc46c52816435f1255dde152fcc0f01cd212077761a2e2e27dbef465a8513"} Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.395250 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.404381 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.410255 4778 csr.go:261] certificate signing request csr-gk55w is approved, waiting to be issued Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.423603 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.424333 4778 csr.go:257] certificate signing request csr-gk55w is issued Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.445518 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.459383 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.488873 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05
T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.514819 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.542358 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.563800 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.589156 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.604795 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.633793 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.670886 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.693484 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.711890 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.724206 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.724314 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.724421 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.724530 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:34.724511348 +0000 UTC m=+21.828307728 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.724529 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.724623 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:34.724604881 +0000 UTC m=+21.828401261 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.730871 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.747399 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.825120 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.825209 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 
15:55:32.825275 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.825281 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:55:34.825258301 +0000 UTC m=+21.929054681 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.825449 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.825467 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.825480 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.825532 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:34.825520888 +0000 UTC m=+21.929317278 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.825545 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.825610 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.825634 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:32 crc kubenswrapper[4778]: E1205 15:55:32.825743 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:34.825715334 +0000 UTC m=+21.929511744 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.919313 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-xm5sq"] Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.919690 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-xm5sq" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.921297 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.921498 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.921940 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.938922 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed0
8287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.954531 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.968934 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.981062 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:32 crc kubenswrapper[4778]: I1205 15:55:32.997326 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:32Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.020590 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.027003 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca6b2f2c-9429-4e93-b595-38d5ac9e0d57-hosts-file\") pod \"node-resolver-xm5sq\" (UID: \"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\") " pod="openshift-dns/node-resolver-xm5sq" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.027061 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7j8p\" (UniqueName: \"kubernetes.io/projected/ca6b2f2c-9429-4e93-b595-38d5ac9e0d57-kube-api-access-b7j8p\") pod \"node-resolver-xm5sq\" (UID: \"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\") " pod="openshift-dns/node-resolver-xm5sq" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.058630 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.091894 4778 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 05 15:55:33 crc kubenswrapper[4778]: W1205 15:55:33.092246 4778 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.092284 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c/status\": read tcp 38.102.83.130:36540->38.102.83.130:6443: use of closed network connection" Dec 05 15:55:33 crc kubenswrapper[4778]: W1205 15:55:33.092387 4778 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 05 15:55:33 crc kubenswrapper[4778]: W1205 15:55:33.092500 4778 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.128014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca6b2f2c-9429-4e93-b595-38d5ac9e0d57-hosts-file\") pod \"node-resolver-xm5sq\" (UID: \"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\") " pod="openshift-dns/node-resolver-xm5sq" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.128080 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7j8p\" (UniqueName: \"kubernetes.io/projected/ca6b2f2c-9429-4e93-b595-38d5ac9e0d57-kube-api-access-b7j8p\") pod \"node-resolver-xm5sq\" (UID: \"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\") " pod="openshift-dns/node-resolver-xm5sq" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.128125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca6b2f2c-9429-4e93-b595-38d5ac9e0d57-hosts-file\") pod \"node-resolver-xm5sq\" (UID: \"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\") " pod="openshift-dns/node-resolver-xm5sq" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.129748 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.156840 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7j8p\" (UniqueName: \"kubernetes.io/projected/ca6b2f2c-9429-4e93-b595-38d5ac9e0d57-kube-api-access-b7j8p\") pod \"node-resolver-xm5sq\" (UID: \"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\") " pod="openshift-dns/node-resolver-xm5sq" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.236512 4778 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-dns/node-resolver-xm5sq"
Dec 05 15:55:33 crc kubenswrapper[4778]: W1205 15:55:33.247971 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca6b2f2c_9429_4e93_b595_38d5ac9e0d57.slice/crio-05ef2d42583726cd26391e4b0530362c325f3214b47004c19b47cd1052047b89 WatchSource:0}: Error finding container 05ef2d42583726cd26391e4b0530362c325f3214b47004c19b47cd1052047b89: Status 404 returned error can't find the container with id 05ef2d42583726cd26391e4b0530362c325f3214b47004c19b47cd1052047b89
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.248440 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.248526 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:55:33 crc kubenswrapper[4778]: E1205 15:55:33.248553 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
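[editor's note] The "Error syncing pod" entries here and just below report the kubelet's network-readiness gate: sandbox creation is refused until a CNI config appears in /etc/kubernetes/cni/net.d/. A minimal Go sketch of that directory check follows; the path is taken from the log line itself, while the extension filter is an illustrative assumption, not the kubelet's exact logic.

// cnicheck.go - minimal sketch of the readiness test implied by the
// "no CNI configuration file in /etc/kubernetes/cni/net.d/" message.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // path reported by the kubelet above
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("NetworkReady=false: cannot read %s: %v\n", confDir, err)
		os.Exit(1)
	}
	var confs []string
	for _, e := range entries {
		// Assumed filter: treat the usual CNI config extensions as valid.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		// This is the state the node above is in: the network operator has
		// not (re)written a CNI config yet, so pod sandboxes cannot start.
		fmt.Println("NetworkReady=false reason:NetworkPluginNotReady")
		return
	}
	fmt.Printf("NetworkReady=true, configs: %v\n", confs)
}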
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.253043 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.254068 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.255615 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.256462 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.266278 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.266799 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.267591 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.268097 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.268718 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.269226 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.270925 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.271599 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.272473 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.272956 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.273845 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.274329 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.274893 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.275727 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.276260 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.277211 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.277687 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.278202 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.279190 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.279847 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.285459 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.286100 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.287284 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.287558 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.287844 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.288394 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.289211 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.289670 4778 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.289768 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.291765 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.292228 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.292626 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.295150 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.295797 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.296672 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.297267 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.297585 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: 
I1205 15:55:33.298277 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.298764 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.299340 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.300276 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.301190 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.301688 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.302714 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.303206 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.306473 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.307035 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.307557 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.308583 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.309175 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.310176 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.310724 4778 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.314671 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.320028 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-nrqmz"] Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.320238 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jqrsw"] Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.320473 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.320749 4778 util.go:30] "No sandbox for pod can be found. 
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.320749 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nrqmz"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.323537 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-gvdqh"]
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.324157 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gvdqh"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.324990 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.325184 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.325431 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.325569 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.325663 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.325764 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.325893 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.326111 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.326300 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.327250 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.329738 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.329881 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.334217 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.350976 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.363078 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.378612 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.394986 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.401131 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xm5sq" event={"ID":"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57","Type":"ContainerStarted","Data":"05ef2d42583726cd26391e4b0530362c325f3214b47004c19b47cd1052047b89"} Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.408270 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.422284 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.425109 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-05 15:50:32 +0000 UTC, rotation deadline is 2026-09-24 05:29:36.592822715 +0000 UTC Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.425192 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7021h34m3.167637469s for next certificate rotation Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430130 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-multus-cni-dir\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " 
pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430186 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-var-lib-kubelet\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430210 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-etc-kubernetes\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430235 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e780ff27-1d00-4280-8e7e-9eb9fe3dea6e-proxy-tls\") pod \"machine-config-daemon-jqrsw\" (UID: \"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\") " pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430258 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2c63eba1-fb5c-431f-beed-0e81832f7e21-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430283 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-var-lib-cni-multus\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5ggh\" (UniqueName: \"kubernetes.io/projected/2c63eba1-fb5c-431f-beed-0e81832f7e21-kube-api-access-x5ggh\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430329 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-multus-conf-dir\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430351 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-cnibin\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430398 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg8xf\" (UniqueName: 
\"kubernetes.io/projected/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-kube-api-access-fg8xf\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430450 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2c63eba1-fb5c-431f-beed-0e81832f7e21-system-cni-dir\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430474 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e780ff27-1d00-4280-8e7e-9eb9fe3dea6e-mcd-auth-proxy-config\") pod \"machine-config-daemon-jqrsw\" (UID: \"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\") " pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430507 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-os-release\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430543 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2c63eba1-fb5c-431f-beed-0e81832f7e21-cni-binary-copy\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430569 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-cni-binary-copy\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430596 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2c63eba1-fb5c-431f-beed-0e81832f7e21-os-release\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430620 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2c63eba1-fb5c-431f-beed-0e81832f7e21-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430661 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-run-multus-certs\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 
15:55:33.430687 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-run-netns\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430710 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6df77\" (UniqueName: \"kubernetes.io/projected/e780ff27-1d00-4280-8e7e-9eb9fe3dea6e-kube-api-access-6df77\") pod \"machine-config-daemon-jqrsw\" (UID: \"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\") " pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430731 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-system-cni-dir\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430771 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-hostroot\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430808 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-multus-daemon-config\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430829 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-multus-socket-dir-parent\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430853 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-var-lib-cni-bin\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430878 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2c63eba1-fb5c-431f-beed-0e81832f7e21-cnibin\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430903 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e780ff27-1d00-4280-8e7e-9eb9fe3dea6e-rootfs\") pod \"machine-config-daemon-jqrsw\" (UID: \"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\") " pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" 
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.430968 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-run-k8s-cni-cncf-io\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.436941 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.449010 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.466758 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.478762 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.495325 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.509267 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.521780 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532419 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-cnibin\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532479 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg8xf\" (UniqueName: 
\"kubernetes.io/projected/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-kube-api-access-fg8xf\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532520 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2c63eba1-fb5c-431f-beed-0e81832f7e21-system-cni-dir\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532542 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e780ff27-1d00-4280-8e7e-9eb9fe3dea6e-mcd-auth-proxy-config\") pod \"machine-config-daemon-jqrsw\" (UID: \"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\") " pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532564 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-os-release\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532597 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2c63eba1-fb5c-431f-beed-0e81832f7e21-cni-binary-copy\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532618 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-cni-binary-copy\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532624 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-cnibin\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532638 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2c63eba1-fb5c-431f-beed-0e81832f7e21-os-release\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532731 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2c63eba1-fb5c-431f-beed-0e81832f7e21-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532776 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-run-multus-certs\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-run-netns\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532828 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6df77\" (UniqueName: \"kubernetes.io/projected/e780ff27-1d00-4280-8e7e-9eb9fe3dea6e-kube-api-access-6df77\") pod \"machine-config-daemon-jqrsw\" (UID: \"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\") " pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532844 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-system-cni-dir\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532861 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-hostroot\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532878 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-multus-daemon-config\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532903 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-multus-socket-dir-parent\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532922 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-var-lib-cni-bin\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532939 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2c63eba1-fb5c-431f-beed-0e81832f7e21-cnibin\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532955 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-run-k8s-cni-cncf-io\") pod \"multus-nrqmz\" (UID: 
\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.532976 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e780ff27-1d00-4280-8e7e-9eb9fe3dea6e-rootfs\") pod \"machine-config-daemon-jqrsw\" (UID: \"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\") " pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533003 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-multus-cni-dir\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533019 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-var-lib-kubelet\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533035 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-etc-kubernetes\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533055 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e780ff27-1d00-4280-8e7e-9eb9fe3dea6e-proxy-tls\") pod \"machine-config-daemon-jqrsw\" (UID: \"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\") " pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533078 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2c63eba1-fb5c-431f-beed-0e81832f7e21-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533097 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-var-lib-cni-multus\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533115 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5ggh\" (UniqueName: \"kubernetes.io/projected/2c63eba1-fb5c-431f-beed-0e81832f7e21-kube-api-access-x5ggh\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533143 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-multus-conf-dir\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " 
pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533194 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-multus-conf-dir\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533233 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-multus-socket-dir-parent\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533257 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-var-lib-cni-bin\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2c63eba1-fb5c-431f-beed-0e81832f7e21-cnibin\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533299 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-run-k8s-cni-cncf-io\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533321 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e780ff27-1d00-4280-8e7e-9eb9fe3dea6e-rootfs\") pod \"machine-config-daemon-jqrsw\" (UID: \"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\") " pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.534497 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-var-lib-kubelet\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.533035 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2c63eba1-fb5c-431f-beed-0e81832f7e21-os-release\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.534542 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-etc-kubernetes\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.534563 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-multus-cni-dir\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.534596 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-var-lib-cni-multus\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.534901 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2c63eba1-fb5c-431f-beed-0e81832f7e21-system-cni-dir\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.535047 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-multus-daemon-config\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.535127 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-system-cni-dir\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.535152 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-run-netns\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.535403 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-os-release\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.535572 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2c63eba1-fb5c-431f-beed-0e81832f7e21-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.535627 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-hostroot\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.536695 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-host-run-multus-certs\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" 
Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.536984 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2c63eba1-fb5c-431f-beed-0e81832f7e21-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.537100 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e780ff27-1d00-4280-8e7e-9eb9fe3dea6e-mcd-auth-proxy-config\") pod \"machine-config-daemon-jqrsw\" (UID: \"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\") " pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.537267 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-cni-binary-copy\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.538080 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2c63eba1-fb5c-431f-beed-0e81832f7e21-cni-binary-copy\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.538943 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.544492 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e780ff27-1d00-4280-8e7e-9eb9fe3dea6e-proxy-tls\") pod \"machine-config-daemon-jqrsw\" (UID: \"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\") " pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.554537 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.556292 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6df77\" (UniqueName: \"kubernetes.io/projected/e780ff27-1d00-4280-8e7e-9eb9fe3dea6e-kube-api-access-6df77\") pod \"machine-config-daemon-jqrsw\" (UID: \"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\") " pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.558093 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5ggh\" (UniqueName: \"kubernetes.io/projected/2c63eba1-fb5c-431f-beed-0e81832f7e21-kube-api-access-x5ggh\") pod \"multus-additional-cni-plugins-gvdqh\" (UID: \"2c63eba1-fb5c-431f-beed-0e81832f7e21\") " pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.563344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg8xf\" (UniqueName: \"kubernetes.io/projected/9b26d99a-f08e-41d1-b35c-5da99cbe3fb4-kube-api-access-fg8xf\") pod \"multus-nrqmz\" (UID: \"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\") " pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.574136 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.588219 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.639054 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.647437 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nrqmz" Dec 05 15:55:33 crc kubenswrapper[4778]: W1205 15:55:33.651416 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode780ff27_1d00_4280_8e7e_9eb9fe3dea6e.slice/crio-b8ff38ad3f1e50430c497785d31333f86e65dac880c3686bbb1e98ad0e13cc86 WatchSource:0}: Error finding container b8ff38ad3f1e50430c497785d31333f86e65dac880c3686bbb1e98ad0e13cc86: Status 404 returned error can't find the container with id b8ff38ad3f1e50430c497785d31333f86e65dac880c3686bbb1e98ad0e13cc86 Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.654034 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" Dec 05 15:55:33 crc kubenswrapper[4778]: W1205 15:55:33.662772 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b26d99a_f08e_41d1_b35c_5da99cbe3fb4.slice/crio-930994e3c648eb2e0f00f7dac1c86b2db93d63478bbe1098001989924aac6047 WatchSource:0}: Error finding container 930994e3c648eb2e0f00f7dac1c86b2db93d63478bbe1098001989924aac6047: Status 404 returned error can't find the container with id 930994e3c648eb2e0f00f7dac1c86b2db93d63478bbe1098001989924aac6047 Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.702343 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vzs5q"] Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.703312 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.707869 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.708110 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.708456 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.709315 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.709460 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.709681 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.709785 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.740947 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.768546 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.780304 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.795605 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.811051 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.826730 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-openvswitch\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836344 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-node-log\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836387 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836503 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-systemd-units\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836567 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-var-lib-openvswitch\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836600 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836621 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-cni-bin\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836759 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-ovn\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836806 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czbcx\" (UniqueName: \"kubernetes.io/projected/6837b168-c691-4e7e-a211-a0c8ef0534e2-kube-api-access-czbcx\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836865 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-env-overrides\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836895 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-kubelet\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836920 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-slash\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836939 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovn-node-metrics-cert\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836961 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovnkube-script-lib\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.836990 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-run-netns\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.837007 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-log-socket\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.837026 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-etc-openvswitch\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.837043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovnkube-config\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.837060 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-cni-netd\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.837091 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-systemd\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.840183 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.853332 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.866689 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.877611 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.894532 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.907170 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.927759 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939015 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-env-overrides\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 
crc kubenswrapper[4778]: I1205 15:55:33.939062 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-kubelet\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939106 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-slash\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939128 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovn-node-metrics-cert\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939422 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-slash\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939485 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-kubelet\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939518 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovnkube-script-lib\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939548 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-run-netns\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939597 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-log-socket\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939626 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-etc-openvswitch\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939670 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovnkube-config\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939695 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-systemd\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939690 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-etc-openvswitch\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939726 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-systemd\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939672 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-run-netns\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939774 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-cni-netd\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939714 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-cni-netd\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939917 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-openvswitch\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.939964 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-node-log\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940078 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940117 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-systemd-units\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940140 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-var-lib-openvswitch\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940159 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940181 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-cni-bin\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940228 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-ovn\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940012 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-openvswitch\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940332 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940042 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-node-log\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940333 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-log-socket\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940400 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-cni-bin\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940390 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-systemd-units\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-var-lib-openvswitch\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940407 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-ovn\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.940251 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czbcx\" (UniqueName: \"kubernetes.io/projected/6837b168-c691-4e7e-a211-a0c8ef0534e2-kube-api-access-czbcx\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.952125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-env-overrides\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.954311 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovnkube-script-lib\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.957624 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovnkube-config\") pod \"ovnkube-node-vzs5q\" (UID: 
\"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:33 crc kubenswrapper[4778]: I1205 15:55:33.957842 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovn-node-metrics-cert\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.001870 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czbcx\" (UniqueName: \"kubernetes.io/projected/6837b168-c691-4e7e-a211-a0c8ef0534e2-kube-api-access-czbcx\") pod \"ovnkube-node-vzs5q\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.032465 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.052192 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 15:55:34 crc kubenswrapper[4778]: W1205 15:55:34.064570 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6837b168_c691_4e7e_a211_a0c8ef0534e2.slice/crio-4c630a0c8da361dcdfcb7d75caedfc1537c7980111735ecef181b7ec88456343 WatchSource:0}: Error finding container 4c630a0c8da361dcdfcb7d75caedfc1537c7980111735ecef181b7ec88456343: Status 404 returned error can't find the container with id 4c630a0c8da361dcdfcb7d75caedfc1537c7980111735ecef181b7ec88456343 Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.248657 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.248783 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.296761 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.298430 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.298477 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.298491 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.298610 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.306039 4778 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.306463 4778 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.307591 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.307619 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.307632 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.307649 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.307660 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:34Z","lastTransitionTime":"2025-12-05T15:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.328616 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.332540 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.332582 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.332591 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.332605 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.332616 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:34Z","lastTransitionTime":"2025-12-05T15:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.345539 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.348396 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.348431 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
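
Each "Error updating node status, will retry" attempt in this stretch fails identically, and the decisive detail sits at the very end of each entry: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 serves a certificate that expired on 2025-08-24, while the node clock reads 2025-12-05. The patch payload itself (node conditions plus the cached image list) is not the problem; no status update can succeed until that certificate is valid again. Below is a small Python sketch to confirm the certificate's validity window from the node, assuming the third-party cryptography package (version 42 or newer for the *_utc accessors) is installed; host and port come from the webhook URL in the error.

#!/usr/bin/env python3
# Minimal sketch: fetch the webhook's TLS certificate WITHOUT verification
# (verification is exactly what fails here) and print its validity window.
# Assumes the third-party "cryptography" package (>= 42) is installed.
import datetime
import socket
import ssl

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # from the webhook URL in the error above

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # accept the expired cert so we can inspect it

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
now = datetime.datetime.now(datetime.timezone.utc)
print("subject:  ", cert.subject.rfc4514_string())
print("notBefore:", cert.not_valid_before_utc)
print("notAfter: ", cert.not_valid_after_utc)
print("expired:  ", now > cert.not_valid_after_utc)

On CRC this pattern typically shows up when the VM is started long after its embedded certificates were issued; certificates are normally rotated as part of cluster start, so the errors usually clear once rotation completes.
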
event="NodeHasNoDiskPressure" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.348442 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.348458 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.348469 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:34Z","lastTransitionTime":"2025-12-05T15:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.362509 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.365788 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.365844 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.365854 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.365875 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.365887 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:34Z","lastTransitionTime":"2025-12-05T15:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.379456 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.387310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.387355 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.387386 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.387403 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.387412 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:34Z","lastTransitionTime":"2025-12-05T15:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.390056 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.399442 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.399554 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.401436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.401463 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.401471 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.401485 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.401495 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:34Z","lastTransitionTime":"2025-12-05T15:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.404656 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.404713 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.404726 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"b8ff38ad3f1e50430c497785d31333f86e65dac880c3686bbb1e98ad0e13cc86"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.405958 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.407744 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xm5sq" event={"ID":"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57","Type":"ContainerStarted","Data":"11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.411015 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrqmz" event={"ID":"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4","Type":"ContainerStarted","Data":"c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.411044 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrqmz" event={"ID":"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4","Type":"ContainerStarted","Data":"930994e3c648eb2e0f00f7dac1c86b2db93d63478bbe1098001989924aac6047"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.412729 4778 generic.go:334] "Generic (PLEG): container finished" podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507" 
exitCode=0 Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.412817 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.412862 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"4c630a0c8da361dcdfcb7d75caedfc1537c7980111735ecef181b7ec88456343"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.415678 4778 generic.go:334] "Generic (PLEG): container finished" podID="2c63eba1-fb5c-431f-beed-0e81832f7e21" containerID="2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52" exitCode=0 Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.415870 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" event={"ID":"2c63eba1-fb5c-431f-beed-0e81832f7e21","Type":"ContainerDied","Data":"2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.416051 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" event={"ID":"2c63eba1-fb5c-431f-beed-0e81832f7e21","Type":"ContainerStarted","Data":"364dd85463d061ffeae36839b37b675b76562f60c21f45b719f22ed9265eb334"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.459876 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.511964 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.512259 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 
15:55:34.512330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.512415 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.512482 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:34Z","lastTransitionTime":"2025-12-05T15:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.518157 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d
a68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.538054 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.553012 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.566749 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.578791 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.594623 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.608500 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.614774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.614811 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.614824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.614840 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.614852 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:34Z","lastTransitionTime":"2025-12-05T15:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.621999 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.632685 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.638133 4778 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.645568 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.658767 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.670554 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.682384 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.699476 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.714295 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.715783 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.723246 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.723289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.723300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.723352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.723396 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:34Z","lastTransitionTime":"2025-12-05T15:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.734173 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.735887 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.749403 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.749474 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.749610 4778 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.749614 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.749671 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:38.749654178 +0000 UTC m=+25.853450568 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.749704 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:38.749683189 +0000 UTC m=+25.853479569 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.750159 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.751487 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.768865 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.780042 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.791788 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.809204 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.826846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:34 
crc kubenswrapper[4778]: I1205 15:55:34.827036 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.827099 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.827163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.827225 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:34Z","lastTransitionTime":"2025-12-05T15:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.830083 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.845145 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.849947 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.850134 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:55:38.850114784 +0000 UTC m=+25.953911164 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.850309 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.850512 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.850537 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.850550 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.850588 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:38.850580786 +0000 UTC m=+25.954377166 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.850515 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.850896 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.850977 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.851043 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:34 crc kubenswrapper[4778]: E1205 15:55:34.851174 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:38.851158492 +0000 UTC m=+25.954954872 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.861718 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"na
me\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.885079 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc 
kubenswrapper[4778]: I1205 15:55:34.907684 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.924796 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.929951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.930021 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.930041 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.930070 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.930089 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:34Z","lastTransitionTime":"2025-12-05T15:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.949100 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.963879 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.978066 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:34 crc kubenswrapper[4778]: I1205 15:55:34.994967 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea17
7225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:34Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.020558 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.033683 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.035001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.035051 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.035063 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.035084 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.035096 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:35Z","lastTransitionTime":"2025-12-05T15:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.063635 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.078128 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.090314 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.104524 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.117876 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.138197 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:35 
crc kubenswrapper[4778]: I1205 15:55:35.138334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.138352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.138389 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.138403 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:35Z","lastTransitionTime":"2025-12-05T15:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.140353 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.241902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.242413 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.242429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.242453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.242468 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:35Z","lastTransitionTime":"2025-12-05T15:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.249043 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:35 crc kubenswrapper[4778]: E1205 15:55:35.249183 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.249516 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:35 crc kubenswrapper[4778]: E1205 15:55:35.249700 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.347339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.347401 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.347412 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.347433 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.347444 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:35Z","lastTransitionTime":"2025-12-05T15:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.425579 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.425630 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.425645 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.425666 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.425680 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.427741 4778 generic.go:334] "Generic (PLEG): container finished" podID="2c63eba1-fb5c-431f-beed-0e81832f7e21" containerID="1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8" exitCode=0 Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.428300 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" event={"ID":"2c63eba1-fb5c-431f-beed-0e81832f7e21","Type":"ContainerDied","Data":"1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.442457 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.450045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.450090 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.450101 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.450117 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.450130 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:35Z","lastTransitionTime":"2025-12-05T15:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.462691 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.478303 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.500323 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.519478 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.536161 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.553289 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.560475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.560512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.560523 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.560539 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.560551 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:35Z","lastTransitionTime":"2025-12-05T15:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.568677 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.585928 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.605281 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin
\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.618787 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.631704 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.658806 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.663475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.663503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.663511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.663526 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.663537 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:35Z","lastTransitionTime":"2025-12-05T15:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.698393 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:35Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.766042 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.766090 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.766101 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.766116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.766127 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:35Z","lastTransitionTime":"2025-12-05T15:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.868462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.868509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.868518 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.868534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.868543 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:35Z","lastTransitionTime":"2025-12-05T15:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.971131 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.971179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.971191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.971208 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:35 crc kubenswrapper[4778]: I1205 15:55:35.971219 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:35Z","lastTransitionTime":"2025-12-05T15:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.073851 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.073921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.073939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.073967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.073986 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:36Z","lastTransitionTime":"2025-12-05T15:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.176579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.176623 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.176634 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.176651 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.176664 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:36Z","lastTransitionTime":"2025-12-05T15:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.249054 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:36 crc kubenswrapper[4778]: E1205 15:55:36.249209 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.280569 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.280630 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.280643 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.280665 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.280684 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:36Z","lastTransitionTime":"2025-12-05T15:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.360489 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-w67tn"] Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.361028 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-w67tn" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.363593 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.365994 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.367446 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.368944 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.383804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.383841 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.383854 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.383872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.383883 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:36Z","lastTransitionTime":"2025-12-05T15:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.385526 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.400928 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.419829 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.435293 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.435622 4778 generic.go:334] "Generic (PLEG): container finished" podID="2c63eba1-fb5c-431f-beed-0e81832f7e21" containerID="24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2" exitCode=0 Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.435657 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" event={"ID":"2c63eba1-fb5c-431f-beed-0e81832f7e21","Type":"ContainerDied","Data":"24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2"} Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.442521 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910"} Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.452612 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.464715 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.467143 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2998e77d-eac9-4670-8527-5cdba406e819-host\") pod \"node-ca-w67tn\" (UID: \"2998e77d-eac9-4670-8527-5cdba406e819\") " pod="openshift-image-registry/node-ca-w67tn" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.467210 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2998e77d-eac9-4670-8527-5cdba406e819-serviceca\") pod \"node-ca-w67tn\" (UID: \"2998e77d-eac9-4670-8527-5cdba406e819\") " pod="openshift-image-registry/node-ca-w67tn" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.467311 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t9p5\" (UniqueName: \"kubernetes.io/projected/2998e77d-eac9-4670-8527-5cdba406e819-kube-api-access-9t9p5\") pod \"node-ca-w67tn\" (UID: \"2998e77d-eac9-4670-8527-5cdba406e819\") " pod="openshift-image-registry/node-ca-w67tn" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.486597 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.487270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.487314 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.487325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.487340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.487350 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:36Z","lastTransitionTime":"2025-12-05T15:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.514332 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.525904 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.548449 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.559255 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.567887 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2998e77d-eac9-4670-8527-5cdba406e819-host\") pod \"node-ca-w67tn\" (UID: \"2998e77d-eac9-4670-8527-5cdba406e819\") " pod="openshift-image-registry/node-ca-w67tn" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.567939 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2998e77d-eac9-4670-8527-5cdba406e819-serviceca\") pod \"node-ca-w67tn\" (UID: \"2998e77d-eac9-4670-8527-5cdba406e819\") " pod="openshift-image-registry/node-ca-w67tn" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.568047 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t9p5\" (UniqueName: \"kubernetes.io/projected/2998e77d-eac9-4670-8527-5cdba406e819-kube-api-access-9t9p5\") pod \"node-ca-w67tn\" (UID: \"2998e77d-eac9-4670-8527-5cdba406e819\") " pod="openshift-image-registry/node-ca-w67tn" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.569703 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2998e77d-eac9-4670-8527-5cdba406e819-host\") pod \"node-ca-w67tn\" (UID: \"2998e77d-eac9-4670-8527-5cdba406e819\") " pod="openshift-image-registry/node-ca-w67tn" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.576509 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.581266 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2998e77d-eac9-4670-8527-5cdba406e819-serviceca\") pod \"node-ca-w67tn\" (UID: \"2998e77d-eac9-4670-8527-5cdba406e819\") " pod="openshift-image-registry/node-ca-w67tn" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.591270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.591321 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.591331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.591352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.591384 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:36Z","lastTransitionTime":"2025-12-05T15:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.594579 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t9p5\" (UniqueName: \"kubernetes.io/projected/2998e77d-eac9-4670-8527-5cdba406e819-kube-api-access-9t9p5\") pod \"node-ca-w67tn\" (UID: \"2998e77d-eac9-4670-8527-5cdba406e819\") " pod="openshift-image-registry/node-ca-w67tn" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.600664 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.613204 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.624880 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.637934 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.651592 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.669190 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\
":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.684620 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-w67tn" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.685163 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.693443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.693480 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.693492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.693509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.693521 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:36Z","lastTransitionTime":"2025-12-05T15:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.699823 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: W1205 15:55:36.704274 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2998e77d_eac9_4670_8527_5cdba406e819.slice/crio-45e91b27ac02cced846ba12d2b8a1f3454b96a94183bc4ea762abd8826cc48fc WatchSource:0}: Error finding container 45e91b27ac02cced846ba12d2b8a1f3454b96a94183bc4ea762abd8826cc48fc: Status 404 returned error can't find the container with id 45e91b27ac02cced846ba12d2b8a1f3454b96a94183bc4ea762abd8826cc48fc Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.710951 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.733396 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.744110 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.764482 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.796944 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.796980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.796989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.797003 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.797013 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:36Z","lastTransitionTime":"2025-12-05T15:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.799560 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.839416 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.880246 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.899481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.899517 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.899526 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.899541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.899551 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:36Z","lastTransitionTime":"2025-12-05T15:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.917782 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:36 crc kubenswrapper[4778]: I1205 15:55:36.958800 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.000285 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\
":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary
-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.001836 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.001873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.001881 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.001896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.001907 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:37Z","lastTransitionTime":"2025-12-05T15:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.103423 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.103455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.103463 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.103475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.103483 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:37Z","lastTransitionTime":"2025-12-05T15:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.205987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.206031 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.206040 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.206055 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.206066 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:37Z","lastTransitionTime":"2025-12-05T15:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.248563 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:37 crc kubenswrapper[4778]: E1205 15:55:37.248745 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.248818 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:37 crc kubenswrapper[4778]: E1205 15:55:37.249007 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.309182 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.309252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.309271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.309289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.309302 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:37Z","lastTransitionTime":"2025-12-05T15:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.412121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.412194 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.412217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.412245 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.412264 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:37Z","lastTransitionTime":"2025-12-05T15:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.448108 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w67tn" event={"ID":"2998e77d-eac9-4670-8527-5cdba406e819","Type":"ContainerStarted","Data":"22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.448185 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w67tn" event={"ID":"2998e77d-eac9-4670-8527-5cdba406e819","Type":"ContainerStarted","Data":"45e91b27ac02cced846ba12d2b8a1f3454b96a94183bc4ea762abd8826cc48fc"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.453561 4778 generic.go:334] "Generic (PLEG): container finished" podID="2c63eba1-fb5c-431f-beed-0e81832f7e21" containerID="dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108" exitCode=0 Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.453627 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" event={"ID":"2c63eba1-fb5c-431f-beed-0e81832f7e21","Type":"ContainerDied","Data":"dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.466287 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",
\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.489895 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.511140 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.515496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.515552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.515566 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.515587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.515601 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:37Z","lastTransitionTime":"2025-12-05T15:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.528125 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.545917 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.562042 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.577774 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.621965 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z 
is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.627581 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.627629 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.627640 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.627661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.627672 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:37Z","lastTransitionTime":"2025-12-05T15:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.643493 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.658782 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.684182 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e
895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.698138 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.711629 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.722478 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.730145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.730172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.730180 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.730194 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.730203 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:37Z","lastTransitionTime":"2025-12-05T15:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.737305 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.750385 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.761924 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.775695 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.786878 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.810118 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.832961 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.833038 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.833057 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.833086 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.833116 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:37Z","lastTransitionTime":"2025-12-05T15:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.841876 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.896170 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.919268 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.936194 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.936220 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.936230 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.936243 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.936252 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:37Z","lastTransitionTime":"2025-12-05T15:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:37 crc kubenswrapper[4778]: I1205 15:55:37.960847 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:37Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.006278 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.040247 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.040320 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.040344 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.040413 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.040449 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:38Z","lastTransitionTime":"2025-12-05T15:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.049685 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.085791 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.129627 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.142855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.142924 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.142943 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.142973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.142994 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:38Z","lastTransitionTime":"2025-12-05T15:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.169985 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.206938 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8
s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.248779 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.249002 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.252499 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.252583 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.252613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.252641 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.252663 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:38Z","lastTransitionTime":"2025-12-05T15:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.355954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.355997 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.356010 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.356029 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.356040 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:38Z","lastTransitionTime":"2025-12-05T15:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.465837 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad"} Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.471044 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.471105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.471129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.471161 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.471188 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:38Z","lastTransitionTime":"2025-12-05T15:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.475009 4778 generic.go:334] "Generic (PLEG): container finished" podID="2c63eba1-fb5c-431f-beed-0e81832f7e21" containerID="099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1" exitCode=0 Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.475085 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" event={"ID":"2c63eba1-fb5c-431f-beed-0e81832f7e21","Type":"ContainerDied","Data":"099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1"} Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.502834 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-conf
ig\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.523067 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.548949 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.567084 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.574620 4778 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.574672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.574686 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.574710 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.574726 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:38Z","lastTransitionTime":"2025-12-05T15:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.587431 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.607910 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.626941 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.649028 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.664266 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.678763 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.679163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.679218 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.679232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.679257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.679280 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:38Z","lastTransitionTime":"2025-12-05T15:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.699486 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.710613 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.734331 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.763746 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.781835 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.781877 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.781890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.781907 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.781921 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:38Z","lastTransitionTime":"2025-12-05T15:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.798081 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:38Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.799465 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.799519 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.799680 4778 secret.go:188] Couldn't get 
secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.799745 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:46.799727773 +0000 UTC m=+33.903524163 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.799823 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.799868 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:46.799854957 +0000 UTC m=+33.903651337 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.885030 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.885071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.885082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.885102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.885114 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:38Z","lastTransitionTime":"2025-12-05T15:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.900516 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.900868 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.900951 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.901061 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:55:46.90101773 +0000 UTC m=+34.004814150 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.901144 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.901168 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.901216 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.901235 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.901303 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-05 15:55:46.901281097 +0000 UTC m=+34.005077487 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.901181 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.901343 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:38 crc kubenswrapper[4778]: E1205 15:55:38.901464 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 15:55:46.901443442 +0000 UTC m=+34.005239832 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.989659 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.989725 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.989744 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.989769 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:38 crc kubenswrapper[4778]: I1205 15:55:38.989790 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:38Z","lastTransitionTime":"2025-12-05T15:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.093655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.093724 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.093746 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.093781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.093803 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:39Z","lastTransitionTime":"2025-12-05T15:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.196924 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.197016 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.197039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.197071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.197093 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:39Z","lastTransitionTime":"2025-12-05T15:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.249042 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:39 crc kubenswrapper[4778]: E1205 15:55:39.249327 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.250008 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:39 crc kubenswrapper[4778]: E1205 15:55:39.250171 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.300842 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.300908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.300922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.300946 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.300962 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:39Z","lastTransitionTime":"2025-12-05T15:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.404294 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.404416 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.404438 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.404461 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.404481 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:39Z","lastTransitionTime":"2025-12-05T15:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.487055 4778 generic.go:334] "Generic (PLEG): container finished" podID="2c63eba1-fb5c-431f-beed-0e81832f7e21" containerID="5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988" exitCode=0 Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.487168 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" event={"ID":"2c63eba1-fb5c-431f-beed-0e81832f7e21","Type":"ContainerDied","Data":"5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988"} Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.507603 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.507905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.508105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.508292 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.508508 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:39Z","lastTransitionTime":"2025-12-05T15:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.512336 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.537335 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.558630 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.581974 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.613218 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.613273 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.613289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.613318 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.613337 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:39Z","lastTransitionTime":"2025-12-05T15:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.618846 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463a
ce1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.637610 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.654196 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/opens
hift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.686128 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e
895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.701860 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.716931 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.716976 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.716986 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.717003 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.717014 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:39Z","lastTransitionTime":"2025-12-05T15:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.718069 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.737869 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.752110 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.770158 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.787839 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.811450 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d
074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:39Z is after 2025-08-24T17:21:41Z"
Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.820395 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.820455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.820468 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.820488 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.820500 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:39Z","lastTransitionTime":"2025-12-05T15:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.923529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.923583 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.923592 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.923608 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:55:39 crc kubenswrapper[4778]: I1205 15:55:39.923619 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:39Z","lastTransitionTime":"2025-12-05T15:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.026979 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.027089 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.027110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.027141 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.027167 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:40Z","lastTransitionTime":"2025-12-05T15:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.129755 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.129831 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.129849 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.129876 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.129893 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:40Z","lastTransitionTime":"2025-12-05T15:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.232997 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.233033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.233041 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.233056 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.233068 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:40Z","lastTransitionTime":"2025-12-05T15:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.248452 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:55:40 crc kubenswrapper[4778]: E1205 15:55:40.248558 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.336595 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.336671 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.336696 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.336729 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.336747 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:40Z","lastTransitionTime":"2025-12-05T15:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.440418 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.440482 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.440502 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.440526 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.440547 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:40Z","lastTransitionTime":"2025-12-05T15:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.504647 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d"} Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.505164 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.510991 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" event={"ID":"2c63eba1-fb5c-431f-beed-0e81832f7e21","Type":"ContainerStarted","Data":"4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d"} Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.525556 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.540918 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.543494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.543529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.543539 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.543554 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.543564 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:40Z","lastTransitionTime":"2025-12-05T15:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.544927 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.554833 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc 
kubenswrapper[4778]: I1205 15:55:40.566999 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.580179 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.595540 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.627783 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.639574 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.646069 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.646243 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.646424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.646546 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.646631 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:40Z","lastTransitionTime":"2025-12-05T15:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.661399 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\
"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.681567 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.694177 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.711425 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.726235 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.745811 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.750159 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.750351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.750615 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.750776 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.750912 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:40Z","lastTransitionTime":"2025-12-05T15:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.771569 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.792855 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.812813 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.829766 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.849608 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.853980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.854024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:40 crc kubenswrapper[4778]: 
I1205 15:55:40.854040 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.854063 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.854081 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:40Z","lastTransitionTime":"2025-12-05T15:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.884923 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.902232 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.933582 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-sock
et\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.950614 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.956152 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.956185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.956193 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.956206 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.956217 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:40Z","lastTransitionTime":"2025-12-05T15:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.969634 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:40 crc kubenswrapper[4778]: I1205 15:55:40.985256 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:40Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.004577 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.024166 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.043520 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.059044 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.059122 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.059146 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.059176 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.059218 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:41Z","lastTransitionTime":"2025-12-05T15:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.062659 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.081075 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z"
Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.163159 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.163239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.163261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.163290 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.163308 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:41Z","lastTransitionTime":"2025-12-05T15:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.249468 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.249491 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:55:41 crc kubenswrapper[4778]: E1205 15:55:41.249797 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:55:41 crc kubenswrapper[4778]: E1205 15:55:41.249652 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.266080 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.266160 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.266185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.266218 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.266241 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:41Z","lastTransitionTime":"2025-12-05T15:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.369655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.369727 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.369744 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.369769 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.369786 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:41Z","lastTransitionTime":"2025-12-05T15:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.411525 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.431690 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.455413 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"202
5-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\
\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.473955 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.474031 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.474124 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.474178 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.474275 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:41Z","lastTransitionTime":"2025-12-05T15:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.486666 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.511490 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.523461 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.523678 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.548004 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b1
54edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z"
Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.577331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.577412 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.577433 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.577456 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.577473 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:41Z","lastTransitionTime":"2025-12-05T15:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.609605 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.614264 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.635074 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.668177 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e
895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.680066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.680381 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.680490 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.680591 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.680680 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:41Z","lastTransitionTime":"2025-12-05T15:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.682311 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.705022 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.720397 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.737519 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.753443 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.766171 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.778737 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.783395 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:41 
crc kubenswrapper[4778]: I1205 15:55:41.783445 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.783460 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.783482 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.783499 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:41Z","lastTransitionTime":"2025-12-05T15:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.793929 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.813930 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.828075 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.838666 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.852619 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff1
18f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.873118 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a6470
27d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.884389 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.885774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.885802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.885811 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.885824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.885833 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:41Z","lastTransitionTime":"2025-12-05T15:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.904280 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/r
un/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.915303 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.932749 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.946782 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.959121 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.972071 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.985778 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:41Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.988225 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.988273 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.988293 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.988315 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:41 crc kubenswrapper[4778]: I1205 15:55:41.988334 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:41Z","lastTransitionTime":"2025-12-05T15:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.016227 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:
55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:42Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.090443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.090488 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.090496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.090512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.090521 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:42Z","lastTransitionTime":"2025-12-05T15:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.193065 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.193134 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.193153 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.193180 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.193204 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:42Z","lastTransitionTime":"2025-12-05T15:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.248873 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:42 crc kubenswrapper[4778]: E1205 15:55:42.249097 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.296807 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.296865 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.296882 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.296905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.296923 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:42Z","lastTransitionTime":"2025-12-05T15:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.400204 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.400247 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.400265 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.400289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.400307 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:42Z","lastTransitionTime":"2025-12-05T15:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.503086 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.503137 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.503147 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.503171 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.503183 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:42Z","lastTransitionTime":"2025-12-05T15:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.606464 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.606547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.606607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.606639 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.606663 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:42Z","lastTransitionTime":"2025-12-05T15:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.710480 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.710554 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.710575 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.710600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.710621 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:42Z","lastTransitionTime":"2025-12-05T15:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.814252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.814306 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.814324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.814350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.814400 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:42Z","lastTransitionTime":"2025-12-05T15:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.917409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.917474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.917513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.917537 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:42 crc kubenswrapper[4778]: I1205 15:55:42.917555 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:42Z","lastTransitionTime":"2025-12-05T15:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.020038 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.020129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.020161 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.020194 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.020218 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:43Z","lastTransitionTime":"2025-12-05T15:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.124161 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.124245 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.124263 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.124289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.124307 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:43Z","lastTransitionTime":"2025-12-05T15:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.227438 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.227492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.227507 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.227528 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.227543 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:43Z","lastTransitionTime":"2025-12-05T15:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.248930 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:43 crc kubenswrapper[4778]: E1205 15:55:43.249129 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.249334 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:43 crc kubenswrapper[4778]: E1205 15:55:43.249566 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.268961 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.292300 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.309473 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.327572 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.329211 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.329259 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.329281 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.329308 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.329326 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:43Z","lastTransitionTime":"2025-12-05T15:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.359450 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.372122 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.387989 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.408358 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.420285 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.432507 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.432569 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.432583 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.432606 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.432621 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:43Z","lastTransitionTime":"2025-12-05T15:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.437133 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.452330 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.464486 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.478767 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.492512 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.508764 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.530972 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/0.log" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.533971 4778 generic.go:334] "Generic (PLEG): container finished" podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d" exitCode=1 Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.534022 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d"} Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.534180 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.534203 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.534213 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.534226 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.534237 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:43Z","lastTransitionTime":"2025-12-05T15:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.536713 4778 scope.go:117] "RemoveContainer" containerID="a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.551237 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.572445 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T
15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.588824 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.602869 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.617039 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.630795 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.637708 4778 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.637756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.637767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.637785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.637796 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:43Z","lastTransitionTime":"2025-12-05T15:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.647200 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.665508 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.689900 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e
895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.703312 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.737896 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35ecc957d096a8aa6e28dcd27d96ad929176feb
47e3b67952536fd535ee4b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:55:42Z\\\",\\\"message\\\":\\\"actory.go:160\\\\nI1205 15:55:42.199480 6073 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:55:42.199557 6073 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 15:55:42.199738 6073 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:42.199722 6073 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:42.199774 6073 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:42.199895 6073 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:55:42.200415 6073 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:42.200449 6073 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.741432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.741525 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.741543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.741571 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.741588 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:43Z","lastTransitionTime":"2025-12-05T15:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.762524 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.788531 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.810866 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.832777 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.845275 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:43 
crc kubenswrapper[4778]: I1205 15:55:43.845394 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.845415 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.845445 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.845469 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:43Z","lastTransitionTime":"2025-12-05T15:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.949061 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.949133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.949155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.949185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:43 crc kubenswrapper[4778]: I1205 15:55:43.949209 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:43Z","lastTransitionTime":"2025-12-05T15:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.051638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.051679 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.051691 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.051707 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.051718 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.153819 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.154219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.154327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.154460 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.154581 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.249492 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:44 crc kubenswrapper[4778]: E1205 15:55:44.249685 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.257219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.257259 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.257272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.257289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.257301 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.359958 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.359996 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.360005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.360018 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.360027 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.461789 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.461870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.461894 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.461927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.461952 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.523749 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.523807 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.523824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.523847 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.523866 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.539518 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/0.log" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.542767 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.543653 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:44 crc kubenswrapper[4778]: E1205 15:55:44.546768 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.552280 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.552350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.552401 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.552433 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.552453 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.561304 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: E1205 15:55:44.568647 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.572654 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.572753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.572783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.572822 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.572842 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.579195 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.592453 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: E1205 15:55:44.594193 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.599347 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.599409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.599419 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.599436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.599446 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.608627 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: E1205 15:55:44.616244 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.620254 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.620322 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.620342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.620453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.620501 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.624184 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: E1205 15:55:44.640633 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: E1205 15:55:44.640924 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.642927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.642973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.642985 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.643004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.643015 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.644272 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.656402 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.667940 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.690874 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.706024 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.721690 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.738297 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.746120 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.746168 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.746180 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.746199 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.746214 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.756001 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.766580 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.784888 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:55:42Z\\\",\\\"message\\\":\\\"actory.go:160\\\\nI1205 15:55:42.199480 6073 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:55:42.199557 6073 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 15:55:42.199738 6073 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:42.199722 6073 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:42.199774 6073 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:42.199895 6073 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:55:42.200415 6073 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:42.200449 6073 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:44Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.863506 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.863549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.863558 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.863573 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.863582 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.966638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.966710 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.966727 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.966753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:44 crc kubenswrapper[4778]: I1205 15:55:44.966770 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:44Z","lastTransitionTime":"2025-12-05T15:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.069919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.069989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.070009 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.070036 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.070058 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:45Z","lastTransitionTime":"2025-12-05T15:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.173241 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.173334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.173357 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.173438 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.173463 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:45Z","lastTransitionTime":"2025-12-05T15:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.248744 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.248893 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:45 crc kubenswrapper[4778]: E1205 15:55:45.248917 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:45 crc kubenswrapper[4778]: E1205 15:55:45.249184 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.276391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.276457 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.276474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.276501 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.276522 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:45Z","lastTransitionTime":"2025-12-05T15:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.379099 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.379213 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.379233 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.379263 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.379283 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:45Z","lastTransitionTime":"2025-12-05T15:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.483410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.483570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.483660 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.483698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.483774 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:45Z","lastTransitionTime":"2025-12-05T15:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.550205 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/1.log" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.551756 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/0.log" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.556766 4778 generic.go:334] "Generic (PLEG): container finished" podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99" exitCode=1 Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.556823 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99"} Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.556864 4778 scope.go:117] "RemoveContainer" containerID="a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.557597 4778 scope.go:117] "RemoveContainer" containerID="0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99" Dec 05 15:55:45 crc kubenswrapper[4778]: E1205 15:55:45.557797 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.578481 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.586971 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.587014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.587047 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.587064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.587137 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:45Z","lastTransitionTime":"2025-12-05T15:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.600870 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.620356 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.641890 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.677135 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e
895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.690043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.690137 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.690163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.690193 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.690215 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:45Z","lastTransitionTime":"2025-12-05T15:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.694468 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.724733 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a35ecc957d096a8aa6e28dcd27d96ad929176feb47e3b67952536fd535ee4b8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:55:42Z\\\",\\\"message\\\":\\\"actory.go:160\\\\nI1205 15:55:42.199480 6073 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:55:42.199557 6073 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 15:55:42.199738 6073 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:42.199722 6073 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:42.199774 6073 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:42.199895 6073 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:55:42.200415 6073 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:42.200449 6073 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"60\\\\nI1205 15:55:44.623667 6200 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623757 6200 
reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623808 6200 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 15:55:44.623860 6200 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 15:55:44.623875 6200 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 15:55:44.623881 6200 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 15:55:44.623920 6200 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 15:55:44.624001 6200 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 15:55:44.624024 6200 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 15:55:44.624055 6200 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 15:55:44.624087 6200 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 15:55:44.624413 6200 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 15:55:44.624441 6200 factory.go:656] Stopping watch factory\\\\nI1205 15:55:44.624459 6200 ovnkube.go:599] Stopped ovnkube\\\\nI1205 15:55:44.624464 6200 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.742434 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.763571 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.785173 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.793043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.793090 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.793101 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.793117 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.793132 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:45Z","lastTransitionTime":"2025-12-05T15:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.807815 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.829656 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.845906 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.861967 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.876604 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T15:55:45Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.895704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.895755 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.895767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.895783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.895795 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:45Z","lastTransitionTime":"2025-12-05T15:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.998947 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.999019 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.999037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.999061 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:45 crc kubenswrapper[4778]: I1205 15:55:45.999081 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:45Z","lastTransitionTime":"2025-12-05T15:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.102345 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.102460 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.102478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.102504 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.102522 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:46Z","lastTransitionTime":"2025-12-05T15:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.205269 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.205351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.205410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.205446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.205472 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:46Z","lastTransitionTime":"2025-12-05T15:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.248996 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:46 crc kubenswrapper[4778]: E1205 15:55:46.249238 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.308578 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.308658 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.308677 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.308709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.308729 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:46Z","lastTransitionTime":"2025-12-05T15:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.411472 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.411608 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.411641 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.411677 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.411703 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:46Z","lastTransitionTime":"2025-12-05T15:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.514400 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.514464 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.514476 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.514493 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.514506 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:46Z","lastTransitionTime":"2025-12-05T15:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.567845 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/1.log" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.573959 4778 scope.go:117] "RemoveContainer" containerID="0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99" Dec 05 15:55:46 crc kubenswrapper[4778]: E1205 15:55:46.574270 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.594672 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv"] Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.595193 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.598764 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.598977 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.606958 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name
\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.617150 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.617198 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.617207 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.617224 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.617236 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:46Z","lastTransitionTime":"2025-12-05T15:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.626288 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.641188 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 
15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.661707 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.678269 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.695494 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9j5k\" (UniqueName: \"kubernetes.io/projected/4770dfb0-d6eb-436d-a657-3539f03c6e0e-kube-api-access-m9j5k\") pod \"ovnkube-control-plane-749d76644c-p9nwv\" (UID: \"4770dfb0-d6eb-436d-a657-3539f03c6e0e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.695626 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4770dfb0-d6eb-436d-a657-3539f03c6e0e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-p9nwv\" (UID: \"4770dfb0-d6eb-436d-a657-3539f03c6e0e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.695706 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4770dfb0-d6eb-436d-a657-3539f03c6e0e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-p9nwv\" (UID: \"4770dfb0-d6eb-436d-a657-3539f03c6e0e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.695751 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4770dfb0-d6eb-436d-a657-3539f03c6e0e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-p9nwv\" (UID: \"4770dfb0-d6eb-436d-a657-3539f03c6e0e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.697667 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.719304 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.719342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.719353 4778 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.719392 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.719405 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:46Z","lastTransitionTime":"2025-12-05T15:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.726811 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0db
cb731519691d9c3bd6074e99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"60\\\\nI1205 15:55:44.623667 6200 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623757 6200 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623808 6200 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 15:55:44.623860 6200 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 15:55:44.623875 6200 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 15:55:44.623881 6200 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 15:55:44.623920 6200 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 15:55:44.624001 6200 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 15:55:44.624024 6200 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 15:55:44.624055 6200 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 15:55:44.624087 6200 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 15:55:44.624413 6200 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 15:55:44.624441 6200 factory.go:656] Stopping watch factory\\\\nI1205 15:55:44.624459 6200 ovnkube.go:599] Stopped ovnkube\\\\nI1205 15:55:44.624464 6200 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.741962 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.755729 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.795481 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e
895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.796325 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9j5k\" (UniqueName: \"kubernetes.io/projected/4770dfb0-d6eb-436d-a657-3539f03c6e0e-kube-api-access-m9j5k\") pod \"ovnkube-control-plane-749d76644c-p9nwv\" (UID: \"4770dfb0-d6eb-436d-a657-3539f03c6e0e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.796423 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4770dfb0-d6eb-436d-a657-3539f03c6e0e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-p9nwv\" (UID: \"4770dfb0-d6eb-436d-a657-3539f03c6e0e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.796478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4770dfb0-d6eb-436d-a657-3539f03c6e0e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-p9nwv\" (UID: \"4770dfb0-d6eb-436d-a657-3539f03c6e0e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.796505 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4770dfb0-d6eb-436d-a657-3539f03c6e0e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-p9nwv\" (UID: \"4770dfb0-d6eb-436d-a657-3539f03c6e0e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.797471 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4770dfb0-d6eb-436d-a657-3539f03c6e0e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-p9nwv\" (UID: 
\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.797546 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4770dfb0-d6eb-436d-a657-3539f03c6e0e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-p9nwv\" (UID: \"4770dfb0-d6eb-436d-a657-3539f03c6e0e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.806065 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4770dfb0-d6eb-436d-a657-3539f03c6e0e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-p9nwv\" (UID: \"4770dfb0-d6eb-436d-a657-3539f03c6e0e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.809642 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.817151 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9j5k\" (UniqueName: \"kubernetes.io/projected/4770dfb0-d6eb-436d-a657-3539f03c6e0e-kube-api-access-m9j5k\") pod \"ovnkube-control-plane-749d76644c-p9nwv\" (UID: \"4770dfb0-d6eb-436d-a657-3539f03c6e0e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.821804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.821848 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.821864 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.821886 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.821903 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:46Z","lastTransitionTime":"2025-12-05T15:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.827885 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.845026 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.865894 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.882920 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.897541 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.897644 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:46 crc kubenswrapper[4778]: E1205 15:55:46.897769 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:55:46 crc kubenswrapper[4778]: E1205 15:55:46.897843 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:55:46 crc kubenswrapper[4778]: E1205 15:55:46.897861 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:56:02.897841107 +0000 UTC m=+50.001637507 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:55:46 crc kubenswrapper[4778]: E1205 15:55:46.898018 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:56:02.897996722 +0000 UTC m=+50.001793112 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.899654 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.913841 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.926300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.926361 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.926407 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.926432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.926451 
4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:46Z","lastTransitionTime":"2025-12-05T15:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.927302 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.944847 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.951904 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.967507 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0db
cb731519691d9c3bd6074e99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"60\\\\nI1205 15:55:44.623667 6200 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623757 6200 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623808 6200 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 15:55:44.623860 6200 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 15:55:44.623875 6200 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 15:55:44.623881 6200 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 15:55:44.623920 6200 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 15:55:44.624001 6200 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 15:55:44.624024 6200 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 15:55:44.624055 6200 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 15:55:44.624087 6200 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 15:55:44.624413 6200 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 15:55:44.624441 6200 factory.go:656] Stopping watch factory\\\\nI1205 15:55:44.624459 6200 ovnkube.go:599] Stopped ovnkube\\\\nI1205 15:55:44.624464 6200 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: W1205 15:55:46.972434 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4770dfb0_d6eb_436d_a657_3539f03c6e0e.slice/crio-2f8314a7bbd60c9ffa6d050bf41058ce3d3e728dba4f06fa10bbe380fcdd66bc WatchSource:0}: Error finding container 2f8314a7bbd60c9ffa6d050bf41058ce3d3e728dba4f06fa10bbe380fcdd66bc: Status 404 returned error can't find the container with id 2f8314a7bbd60c9ffa6d050bf41058ce3d3e728dba4f06fa10bbe380fcdd66bc Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.983603 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:46Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.998298 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.998511 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:46 crc kubenswrapper[4778]: I1205 15:55:46.998583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:46 crc kubenswrapper[4778]: E1205 15:55:46.998889 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:56:02.998862408 +0000 UTC m=+50.102658828 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:55:46 crc kubenswrapper[4778]: E1205 15:55:46.998943 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:55:46 crc kubenswrapper[4778]: E1205 15:55:46.999051 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:55:46 crc kubenswrapper[4778]: E1205 15:55:46.999113 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:46 crc kubenswrapper[4778]: E1205 15:55:46.999260 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 15:56:02.999221977 +0000 UTC m=+50.103018397 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:46 crc kubenswrapper[4778]: E1205 15:55:46.999906 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:55:47 crc kubenswrapper[4778]: E1205 15:55:47.000110 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:55:47 crc kubenswrapper[4778]: E1205 15:55:47.000294 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:47 crc kubenswrapper[4778]: E1205 15:55:47.000611 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 15:56:03.000570243 +0000 UTC m=+50.104366713 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.010689 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.029455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.029508 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.029534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.029566 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.029586 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:47Z","lastTransitionTime":"2025-12-05T15:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.036543 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.049151 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.069133 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.088652 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.104259 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.120697 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.132961 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.133005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.133015 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.133031 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.133042 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:47Z","lastTransitionTime":"2025-12-05T15:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.136083 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.148040 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.169126 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.234919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.234987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.235006 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.235031 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.235048 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:47Z","lastTransitionTime":"2025-12-05T15:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.249090 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.249112 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:47 crc kubenswrapper[4778]: E1205 15:55:47.249193 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:55:47 crc kubenswrapper[4778]: E1205 15:55:47.249401 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.337518 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8tvxd"] Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.338132 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:47 crc kubenswrapper[4778]: E1205 15:55:47.338208 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.339624 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.339679 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.339697 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.339723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.339736 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:47Z","lastTransitionTime":"2025-12-05T15:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.358212 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\
\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.375800 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.391189 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.412319 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.435467 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.443104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.443139 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.443148 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.443163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.443173 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:47Z","lastTransitionTime":"2025-12-05T15:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.456462 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.472744 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.492493 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff1
18f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.503649 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkwnx\" (UniqueName: \"kubernetes.io/projected/48cc0dd1-7387-4df1-aa6a-198ac40c620d-kube-api-access-xkwnx\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.503744 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.529199 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e
895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.544895 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.545735 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.545776 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.545788 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.545801 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.545812 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:47Z","lastTransitionTime":"2025-12-05T15:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.563532 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0db
cb731519691d9c3bd6074e99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"60\\\\nI1205 15:55:44.623667 6200 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623757 6200 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623808 6200 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 15:55:44.623860 6200 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 15:55:44.623875 6200 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 15:55:44.623881 6200 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 15:55:44.623920 6200 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 15:55:44.624001 6200 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 15:55:44.624024 6200 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 15:55:44.624055 6200 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 15:55:44.624087 6200 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 15:55:44.624413 6200 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 15:55:44.624441 6200 factory.go:656] Stopping watch factory\\\\nI1205 15:55:44.624459 6200 ovnkube.go:599] Stopped ovnkube\\\\nI1205 15:55:44.624464 6200 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.575804 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.577467 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" event={"ID":"4770dfb0-d6eb-436d-a657-3539f03c6e0e","Type":"ContainerStarted","Data":"5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.577509 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" event={"ID":"4770dfb0-d6eb-436d-a657-3539f03c6e0e","Type":"ContainerStarted","Data":"a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.577520 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" event={"ID":"4770dfb0-d6eb-436d-a657-3539f03c6e0e","Type":"ContainerStarted","Data":"2f8314a7bbd60c9ffa6d050bf41058ce3d3e728dba4f06fa10bbe380fcdd66bc"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.596948 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.604729 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkwnx\" (UniqueName: \"kubernetes.io/projected/48cc0dd1-7387-4df1-aa6a-198ac40c620d-kube-api-access-xkwnx\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.604779 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:47 crc kubenswrapper[4778]: E1205 
15:55:47.604952 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:55:47 crc kubenswrapper[4778]: E1205 15:55:47.605029 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs podName:48cc0dd1-7387-4df1-aa6a-198ac40c620d nodeName:}" failed. No retries permitted until 2025-12-05 15:55:48.105011339 +0000 UTC m=+35.208807729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs") pod "network-metrics-daemon-8tvxd" (UID: "48cc0dd1-7387-4df1-aa6a-198ac40c620d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.613727 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.622170 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkwnx\" (UniqueName: \"kubernetes.io/projected/48cc0dd1-7387-4df1-aa6a-198ac40c620d-kube-api-access-xkwnx\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.629334 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.643852 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.648433 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:47 
crc kubenswrapper[4778]: I1205 15:55:47.648489 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.648501 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.648521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.648532 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:47Z","lastTransitionTime":"2025-12-05T15:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.657850 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.674699 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.690599 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.701759 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.721017 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.735331 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.751304 4778 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.751378 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.751390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.751411 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.751427 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:47Z","lastTransitionTime":"2025-12-05T15:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.762209 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.786170 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.799525 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.825099 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"60\\\\nI1205 15:55:44.623667 6200 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623757 6200 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623808 6200 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 15:55:44.623860 6200 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 15:55:44.623875 6200 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 15:55:44.623881 6200 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 15:55:44.623920 6200 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 15:55:44.624001 6200 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 15:55:44.624024 6200 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 15:55:44.624055 6200 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 15:55:44.624087 6200 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 15:55:44.624413 6200 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 15:55:44.624441 6200 factory.go:656] Stopping watch factory\\\\nI1205 15:55:44.624459 6200 ovnkube.go:599] Stopped ovnkube\\\\nI1205 15:55:44.624464 6200 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.839278 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.854582 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.854772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.855074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.855227 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.855353 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:47Z","lastTransitionTime":"2025-12-05T15:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.857524 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.881706 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e
895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.900780 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.918448 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.934893 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-oper
ator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.959149 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.959185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.959199 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.959222 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.959242 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:47Z","lastTransitionTime":"2025-12-05T15:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:47 crc kubenswrapper[4778]: I1205 15:55:47.998600 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:47Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:48 crc kubenswrapper[4778]: I1205 15:55:48.019639 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:48Z is after 2025-08-24T17:21:41Z"
[... from 15:55:48.062 through 15:55:51.058 the same five-entry sequence repeats roughly every 100 ms: kubelet_node_status.go:724 "Recording event message for node" (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) followed by setters.go:603 "Node became not ready" with reason KubeletNotReady / NetworkReady=false; the duplicated runs are elided below, keeping only unique entries plus the final 15:55:51.058 run ...]
Dec 05 15:55:48 crc kubenswrapper[4778]: I1205 15:55:48.109135 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:48 crc kubenswrapper[4778]: E1205 15:55:48.109323 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:55:48 crc kubenswrapper[4778]: E1205 15:55:48.109447 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs podName:48cc0dd1-7387-4df1-aa6a-198ac40c620d nodeName:}" failed. No retries permitted until 2025-12-05 15:55:49.109422321 +0000 UTC m=+36.213218731 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs") pod "network-metrics-daemon-8tvxd" (UID: "48cc0dd1-7387-4df1-aa6a-198ac40c620d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:55:48 crc kubenswrapper[4778]: I1205 15:55:48.249097 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:48 crc kubenswrapper[4778]: E1205 15:55:48.249325 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
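Annotation: every "Failed to update status for pod" entry above is the same failure. The kubelet's POST to the pod.network-node-identity.openshift.io webhook on 127.0.0.1:9743 is rejected during the TLS handshake because the webhook's serving certificate expired on 2025-08-24T17:21:41Z, months before the node clock (2025-12-05); until that certificate is renewed (or the clock corrected), every status patch bounces. The "certificate has expired or is not yet valid" wording comes from Go's crypto/x509 validity check, which boils down to the comparison sketched here (a minimal, hypothetical reproduction; the certificate file path is an assumption, and this is not the webhook's actual code):

    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
        "time"
    )

    // Minimal sketch: reproduce the "certificate has expired or is not yet
    // valid" verdict by comparing the clock against the certificate's
    // validity window, the same test x509 verification performs.
    func main() {
        data, err := os.ReadFile("webhook-serving.crt") // hypothetical path
        if err != nil {
            panic(err)
        }
        block, _ := pem.Decode(data)
        if block == nil {
            panic("no PEM block found")
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            panic(err)
        }
        now := time.Now()
        if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
            // The condition behind the x509 error quoted in the log above.
            fmt.Printf("certificate has expired or is not yet valid: current time %s is after %s\n",
                now.UTC().Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
        }
    }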
Dec 05 15:55:49 crc kubenswrapper[4778]: I1205 15:55:49.120888 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:49 crc kubenswrapper[4778]: E1205 15:55:49.121170 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:55:49 crc kubenswrapper[4778]: E1205 15:55:49.121286 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs podName:48cc0dd1-7387-4df1-aa6a-198ac40c620d nodeName:}" failed. No retries permitted until 2025-12-05 15:55:51.121255815 +0000 UTC m=+38.225052235 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs") pod "network-metrics-daemon-8tvxd" (UID: "48cc0dd1-7387-4df1-aa6a-198ac40c620d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:55:49 crc kubenswrapper[4778]: I1205 15:55:49.249395 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:49 crc kubenswrapper[4778]: I1205 15:55:49.249461 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:49 crc kubenswrapper[4778]: E1205 15:55:49.249519 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:55:49 crc kubenswrapper[4778]: I1205 15:55:49.249396 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:49 crc kubenswrapper[4778]: E1205 15:55:49.249614 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:49 crc kubenswrapper[4778]: E1205 15:55:49.249735 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:55:50 crc kubenswrapper[4778]: I1205 15:55:50.248469 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:50 crc kubenswrapper[4778]: E1205 15:55:50.248706 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[The same status block repeats at 15:55:50.542, 15:55:50.645, 15:55:50.749, 15:55:50.851, 15:55:50.954 and 15:55:51.058.]
Dec 05 15:55:51 crc kubenswrapper[4778]: I1205 15:55:51.141277 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:55:51 crc kubenswrapper[4778]: E1205 15:55:51.141460 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 15:55:51 crc kubenswrapper[4778]: E1205 15:55:51.141900 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs podName:48cc0dd1-7387-4df1-aa6a-198ac40c620d nodeName:}" failed. No retries permitted until 2025-12-05 15:55:55.141870694 +0000 UTC m=+42.245667104 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs") pod "network-metrics-daemon-8tvxd" (UID: "48cc0dd1-7387-4df1-aa6a-198ac40c620d") : object "openshift-multus"/"metrics-daemon-secret" not registered
[The same status block repeats at 15:55:51.160.]
Dec 05 15:55:51 crc kubenswrapper[4778]: I1205 15:55:51.248840 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:55:51 crc kubenswrapper[4778]: I1205 15:55:51.248954 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:55:51 crc kubenswrapper[4778]: I1205 15:55:51.249290 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:55:51 crc kubenswrapper[4778]: E1205 15:55:51.249549 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:55:51 crc kubenswrapper[4778]: E1205 15:55:51.249767 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:55:51 crc kubenswrapper[4778]: E1205 15:55:51.250010 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[The same status block repeats at 15:55:51.263 and 15:55:51.365.]
[The same status block repeats at 15:55:51.468, 15:55:51.570, 15:55:51.673, 15:55:51.776, 15:55:51.880, 15:55:51.983, 15:55:52.086 and 15:55:52.189.]
Dec 05 15:55:52 crc kubenswrapper[4778]: I1205 15:55:52.249152 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:55:52 crc kubenswrapper[4778]: E1205 15:55:52.249349 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[The same status block repeats at 15:55:52.293 and 15:55:52.396.]
[The same status block repeats at 15:55:52.498, 15:55:52.601, 15:55:52.703, 15:55:52.806, 15:55:52.908, 15:55:53.011, 15:55:53.115 and 15:55:53.219.]
Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.248789 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.248922 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:55:53 crc kubenswrapper[4778]: E1205 15:55:53.249005 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:53 crc kubenswrapper[4778]: E1205 15:55:53.249101 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.249431 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:53 crc kubenswrapper[4778]: E1205 15:55:53.249636 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.267532 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver
-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] 
\\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.287719 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.306732 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.321383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.321426 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.321440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.321458 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.321500 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:53Z","lastTransitionTime":"2025-12-05T15:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.325718 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.344507 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 
15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.360474 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.376595 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/o
s-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.390550 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc 
kubenswrapper[4778]: I1205 15:55:53.409614 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.424176 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.424253 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.424275 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.424303 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.424322 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:53Z","lastTransitionTime":"2025-12-05T15:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.429307 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.446008 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.459508 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.476184 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff1
18f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.501869 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a6470
27d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.518398 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.527276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.527318 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.527332 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.527353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.527384 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:53Z","lastTransitionTime":"2025-12-05T15:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.548962 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"60\\\\nI1205 15:55:44.623667 6200 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623757 6200 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623808 6200 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 15:55:44.623860 6200 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 15:55:44.623875 6200 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 15:55:44.623881 6200 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 15:55:44.623920 6200 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 15:55:44.624001 6200 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 15:55:44.624024 6200 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 15:55:44.624055 6200 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 15:55:44.624087 6200 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 15:55:44.624413 6200 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 15:55:44.624441 6200 factory.go:656] Stopping watch factory\\\\nI1205 15:55:44.624459 6200 ovnkube.go:599] Stopped ovnkube\\\\nI1205 15:55:44.624464 6200 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.565954 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:53Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.630929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.631002 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.631020 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.631098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.631152 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:53Z","lastTransitionTime":"2025-12-05T15:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.734356 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.734441 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.734461 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.734486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.734505 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:53Z","lastTransitionTime":"2025-12-05T15:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.837285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.837345 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.837353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.837387 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.837397 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:53Z","lastTransitionTime":"2025-12-05T15:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.940203 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.940292 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.940310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.940335 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:53 crc kubenswrapper[4778]: I1205 15:55:53.940356 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:53Z","lastTransitionTime":"2025-12-05T15:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.043248 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.043315 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.043332 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.043355 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.043412 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:54Z","lastTransitionTime":"2025-12-05T15:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.147057 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.147132 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.147151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.147174 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.147190 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:54Z","lastTransitionTime":"2025-12-05T15:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.248729 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:54 crc kubenswrapper[4778]: E1205 15:55:54.249028 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.250024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.250083 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.250095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.250123 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.250141 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:54Z","lastTransitionTime":"2025-12-05T15:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.353443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.353516 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.353540 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.353570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.353595 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:54Z","lastTransitionTime":"2025-12-05T15:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.456970 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.457040 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.457093 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.457127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.457149 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:54Z","lastTransitionTime":"2025-12-05T15:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.560624 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.560713 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.560741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.560774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.560794 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:54Z","lastTransitionTime":"2025-12-05T15:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.663601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.663661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.663681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.663707 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.663730 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:54Z","lastTransitionTime":"2025-12-05T15:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.766822 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.766900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.766919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.766945 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.766968 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:54Z","lastTransitionTime":"2025-12-05T15:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.871699 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.871774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.871791 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.871825 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.871848 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:54Z","lastTransitionTime":"2025-12-05T15:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.963161 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.963223 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.963238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.963258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.963275 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:54Z","lastTransitionTime":"2025-12-05T15:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:54 crc kubenswrapper[4778]: E1205 15:55:54.982856 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:54Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.988736 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.988812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.988836 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.988871 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:54 crc kubenswrapper[4778]: I1205 15:55:54.988894 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:54Z","lastTransitionTime":"2025-12-05T15:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:55 crc kubenswrapper[4778]: E1205 15:55:55.007295 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:55Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.012461 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.012511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.012531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.012551 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.012566 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:55Z","lastTransitionTime":"2025-12-05T15:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:55 crc kubenswrapper[4778]: E1205 15:55:55.030354 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:55Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.035882 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.035976 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.036007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.036041 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.036061 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:55Z","lastTransitionTime":"2025-12-05T15:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:55 crc kubenswrapper[4778]: E1205 15:55:55.054482 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:55Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.059729 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.059810 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.059830 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.059858 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.059880 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:55Z","lastTransitionTime":"2025-12-05T15:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:55 crc kubenswrapper[4778]: E1205 15:55:55.078470 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:55Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:55 crc kubenswrapper[4778]: E1205 15:55:55.078756 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.080861 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.080945 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.080971 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.081001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.081028 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:55Z","lastTransitionTime":"2025-12-05T15:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.184454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.184513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.184529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.184552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.184572 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:55Z","lastTransitionTime":"2025-12-05T15:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.195424 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:55 crc kubenswrapper[4778]: E1205 15:55:55.195581 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:55:55 crc kubenswrapper[4778]: E1205 15:55:55.195648 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs podName:48cc0dd1-7387-4df1-aa6a-198ac40c620d nodeName:}" failed. No retries permitted until 2025-12-05 15:56:03.195630618 +0000 UTC m=+50.299427008 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs") pod "network-metrics-daemon-8tvxd" (UID: "48cc0dd1-7387-4df1-aa6a-198ac40c620d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.249089 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.249216 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:55 crc kubenswrapper[4778]: E1205 15:55:55.249304 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.249360 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:55 crc kubenswrapper[4778]: E1205 15:55:55.249562 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:55 crc kubenswrapper[4778]: E1205 15:55:55.249662 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.290841 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.290899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.290909 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.290925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.290937 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:55Z","lastTransitionTime":"2025-12-05T15:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.394561 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.394615 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.394627 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.394650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.394666 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:55Z","lastTransitionTime":"2025-12-05T15:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.497759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.497840 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.497856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.497880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.497895 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:55Z","lastTransitionTime":"2025-12-05T15:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.601353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.601475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.601499 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.601529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.601552 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:55Z","lastTransitionTime":"2025-12-05T15:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.704538 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.704602 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.704628 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.704660 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.704682 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:55Z","lastTransitionTime":"2025-12-05T15:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.807829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.807898 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.807915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.807938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.807958 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:55Z","lastTransitionTime":"2025-12-05T15:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.911215 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.911264 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.911289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.911310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:55 crc kubenswrapper[4778]: I1205 15:55:55.911325 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:55Z","lastTransitionTime":"2025-12-05T15:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.015691 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.015759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.015777 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.015802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.015819 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:56Z","lastTransitionTime":"2025-12-05T15:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.119420 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.119504 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.119525 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.119554 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.119573 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:56Z","lastTransitionTime":"2025-12-05T15:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.222170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.222239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.222261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.222327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.222345 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:56Z","lastTransitionTime":"2025-12-05T15:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.248470 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:56 crc kubenswrapper[4778]: E1205 15:55:56.248633 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.325862 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.325954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.325974 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.326002 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.326022 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:56Z","lastTransitionTime":"2025-12-05T15:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.429188 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.429288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.429312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.429341 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.429359 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:56Z","lastTransitionTime":"2025-12-05T15:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.532172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.532308 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.532332 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.532358 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.532421 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:56Z","lastTransitionTime":"2025-12-05T15:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.635528 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.635590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.635607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.635631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.635659 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:56Z","lastTransitionTime":"2025-12-05T15:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.739034 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.739116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.739141 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.739173 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.739195 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:56Z","lastTransitionTime":"2025-12-05T15:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.841798 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.841884 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.841907 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.841940 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.841966 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:56Z","lastTransitionTime":"2025-12-05T15:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.945189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.945266 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.945290 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.945327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:56 crc kubenswrapper[4778]: I1205 15:55:56.945349 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:56Z","lastTransitionTime":"2025-12-05T15:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.049424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.049481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.049497 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.049520 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.049540 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:57Z","lastTransitionTime":"2025-12-05T15:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.153055 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.153150 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.153178 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.153213 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.153237 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:57Z","lastTransitionTime":"2025-12-05T15:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.249134 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.249230 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.249167 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:57 crc kubenswrapper[4778]: E1205 15:55:57.249361 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:57 crc kubenswrapper[4778]: E1205 15:55:57.249548 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:55:57 crc kubenswrapper[4778]: E1205 15:55:57.249727 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.257409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.257487 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.257512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.257538 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.257556 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:57Z","lastTransitionTime":"2025-12-05T15:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.360179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.360234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.360244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.360264 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.360274 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:57Z","lastTransitionTime":"2025-12-05T15:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.463693 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.463750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.463772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.463797 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.463819 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:57Z","lastTransitionTime":"2025-12-05T15:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.566883 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.566973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.566999 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.567028 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.567054 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:57Z","lastTransitionTime":"2025-12-05T15:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.670208 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.670292 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.670321 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.670353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.670412 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:57Z","lastTransitionTime":"2025-12-05T15:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.774003 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.774062 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.774074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.774097 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.774110 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:57Z","lastTransitionTime":"2025-12-05T15:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.877611 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.877700 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.877719 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.877745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.877765 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:57Z","lastTransitionTime":"2025-12-05T15:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.980427 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.980504 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.980537 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.980566 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:57 crc kubenswrapper[4778]: I1205 15:55:57.980589 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:57Z","lastTransitionTime":"2025-12-05T15:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.083220 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.083285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.083307 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.083337 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.083361 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:58Z","lastTransitionTime":"2025-12-05T15:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.186866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.186978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.186995 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.187021 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.187043 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:58Z","lastTransitionTime":"2025-12-05T15:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.249070 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:55:58 crc kubenswrapper[4778]: E1205 15:55:58.249271 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.290824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.290893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.290914 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.290980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.291002 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:58Z","lastTransitionTime":"2025-12-05T15:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.395105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.395176 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.395201 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.395233 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.395256 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:58Z","lastTransitionTime":"2025-12-05T15:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.498194 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.498296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.498316 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.498356 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.498411 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:58Z","lastTransitionTime":"2025-12-05T15:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.600888 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.600930 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.600939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.600955 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.600965 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:58Z","lastTransitionTime":"2025-12-05T15:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.704179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.704223 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.704232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.704247 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.704259 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:58Z","lastTransitionTime":"2025-12-05T15:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.807822 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.807860 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.807870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.807886 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.807895 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:58Z","lastTransitionTime":"2025-12-05T15:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.910113 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.910171 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.910184 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.910203 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:58 crc kubenswrapper[4778]: I1205 15:55:58.910218 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:58Z","lastTransitionTime":"2025-12-05T15:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.013709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.013779 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.013799 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.013824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.013842 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:59Z","lastTransitionTime":"2025-12-05T15:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.116988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.117050 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.117066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.117093 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.117110 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:59Z","lastTransitionTime":"2025-12-05T15:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.221144 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.221203 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.221214 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.221234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.221253 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:59Z","lastTransitionTime":"2025-12-05T15:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.249026 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.249050 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.249243 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:55:59 crc kubenswrapper[4778]: E1205 15:55:59.249758 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:55:59 crc kubenswrapper[4778]: E1205 15:55:59.249916 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.249999 4778 scope.go:117] "RemoveContainer" containerID="0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99" Dec 05 15:55:59 crc kubenswrapper[4778]: E1205 15:55:59.250119 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.325455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.326044 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.326064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.326169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.326234 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:59Z","lastTransitionTime":"2025-12-05T15:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.430463 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.430537 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.430560 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.430589 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.430609 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:59Z","lastTransitionTime":"2025-12-05T15:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.533656 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.533707 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.533719 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.533739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.533753 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:59Z","lastTransitionTime":"2025-12-05T15:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.620736 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/1.log" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.623598 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022"} Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.624274 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.643877 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.644032 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.644086 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.644100 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.644122 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.644135 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:59Z","lastTransitionTime":"2025-12-05T15:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.661519 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.676580 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.692293 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.712588 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e
895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.724934 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.747564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.747634 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.747649 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.747673 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.747690 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:59Z","lastTransitionTime":"2025-12-05T15:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.749963 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f
3333adf90c68697f66f22022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"60\\\\nI1205 15:55:44.623667 6200 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623757 6200 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623808 6200 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 15:55:44.623860 6200 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 15:55:44.623875 6200 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 15:55:44.623881 6200 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 15:55:44.623920 6200 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 15:55:44.624001 6200 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 15:55:44.624024 6200 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 15:55:44.624055 6200 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 15:55:44.624087 6200 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 15:55:44.624413 6200 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 15:55:44.624441 6200 factory.go:656] Stopping watch factory\\\\nI1205 15:55:44.624459 6200 ovnkube.go:599] Stopped ovnkube\\\\nI1205 15:55:44.624464 6200 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.768860 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.793517 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.812562 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.826907 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.841316 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.850506 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.850555 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.850564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.850578 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.850588 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:59Z","lastTransitionTime":"2025-12-05T15:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.862906 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 
15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.879492 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.894918 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.911798 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.928069 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T15:55:59Z is after 2025-08-24T17:21:41Z" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.953214 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.953262 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.953273 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.953291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:55:59 crc kubenswrapper[4778]: I1205 15:55:59.953305 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:55:59Z","lastTransitionTime":"2025-12-05T15:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.056359 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.056429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.056444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.056466 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.056478 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:00Z","lastTransitionTime":"2025-12-05T15:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.160186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.160276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.160303 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.160342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.160409 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:00Z","lastTransitionTime":"2025-12-05T15:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.249188 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:00 crc kubenswrapper[4778]: E1205 15:56:00.249428 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.263675 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.263765 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.263790 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.263868 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.263976 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:00Z","lastTransitionTime":"2025-12-05T15:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.366581 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.366655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.366725 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.366749 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.366787 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:00Z","lastTransitionTime":"2025-12-05T15:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.470276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.470351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.470384 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.470404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.470418 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:00Z","lastTransitionTime":"2025-12-05T15:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.574425 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.574494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.574513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.574538 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.574556 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:00Z","lastTransitionTime":"2025-12-05T15:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.628983 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/2.log" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.629936 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/1.log" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.633492 4778 generic.go:334] "Generic (PLEG): container finished" podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022" exitCode=1 Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.633553 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022"} Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.633603 4778 scope.go:117] "RemoveContainer" containerID="0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.634624 4778 scope.go:117] "RemoveContainer" containerID="31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022" Dec 05 15:56:00 crc kubenswrapper[4778]: E1205 15:56:00.634899 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.662076 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.678851 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.678944 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.678970 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.679005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.679029 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:00Z","lastTransitionTime":"2025-12-05T15:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.685174 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.703039 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.720734 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.739929 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff1
18f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.773840 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a6470
27d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.783713 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.783805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:00 crc 
kubenswrapper[4778]: I1205 15:56:00.783825 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.783847 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.783863 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:00Z","lastTransitionTime":"2025-12-05T15:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.791751 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.825046 4778 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0646064f1976d7dd465338757cef914c4e54c0dbcb731519691d9c3bd6074e99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:55:44Z\\\",\\\"message\\\":\\\"60\\\\nI1205 15:55:44.623667 6200 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623757 6200 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:55:44.623808 6200 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 15:55:44.623860 6200 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 15:55:44.623875 6200 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 15:55:44.623881 6200 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 15:55:44.623920 6200 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 15:55:44.624001 6200 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 15:55:44.624024 6200 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 15:55:44.624055 6200 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 15:55:44.624087 6200 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 15:55:44.624413 6200 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 15:55:44.624441 6200 factory.go:656] Stopping watch factory\\\\nI1205 15:55:44.624459 6200 ovnkube.go:599] Stopped ovnkube\\\\nI1205 15:55:44.624464 6200 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:00Z\\\",\\\"message\\\":\\\"opping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392006 6404 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392189 6404 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392396 6404 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392444 6404 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392485 6404 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392615 6404 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392634 6404 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.843295 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.861200 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.882362 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.887332 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.887436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.887455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.887483 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.887500 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:00Z","lastTransitionTime":"2025-12-05T15:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.903702 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.923262 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.941618 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 
15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.962163 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.985894 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/o
s-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:00Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.990233 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.990309 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.990327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.990353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:00 crc kubenswrapper[4778]: I1205 15:56:00.990407 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:00Z","lastTransitionTime":"2025-12-05T15:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.005125 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.093459 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.093535 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.093552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.093577 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.093595 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:01Z","lastTransitionTime":"2025-12-05T15:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.196576 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.196644 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.196661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.196688 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.196707 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:01Z","lastTransitionTime":"2025-12-05T15:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.249600 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.249664 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.249737 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:01 crc kubenswrapper[4778]: E1205 15:56:01.249881 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:01 crc kubenswrapper[4778]: E1205 15:56:01.250058 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:01 crc kubenswrapper[4778]: E1205 15:56:01.250254 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.299801 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.299858 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.299876 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.299899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.299916 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:01Z","lastTransitionTime":"2025-12-05T15:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.404441 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.404520 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.404543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.404572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.404628 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:01Z","lastTransitionTime":"2025-12-05T15:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.508866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.508967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.508989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.509016 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.509034 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:01Z","lastTransitionTime":"2025-12-05T15:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.612239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.612310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.612329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.612360 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.612432 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:01Z","lastTransitionTime":"2025-12-05T15:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.640925 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/2.log" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.646478 4778 scope.go:117] "RemoveContainer" containerID="31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022" Dec 05 15:56:01 crc kubenswrapper[4778]: E1205 15:56:01.646779 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.669792 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.683142 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.700548 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.715502 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.716478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.716538 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.716558 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.716586 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.716605 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:01Z","lastTransitionTime":"2025-12-05T15:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.729595 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.743910 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.769088 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.787510 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.812048 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.820605 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.820662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.820687 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.820719 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.820742 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:01Z","lastTransitionTime":"2025-12-05T15:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.834844 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.855670 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.872510 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.894282 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff1
18f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.923915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.923982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.924002 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.924026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.924045 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:01Z","lastTransitionTime":"2025-12-05T15:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.931456 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.948040 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.977491 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:00Z\\\",\\\"message\\\":\\\"opping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392006 6404 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392189 6404 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392396 6404 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392444 6404 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392485 6404 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392615 6404 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392634 6404 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:01 crc kubenswrapper[4778]: I1205 15:56:01.993257 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:01Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.027116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.027194 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.027230 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.027261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.027283 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:02Z","lastTransitionTime":"2025-12-05T15:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.130443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.130525 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.130549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.130580 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.130603 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:02Z","lastTransitionTime":"2025-12-05T15:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.233938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.233999 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.234020 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.234058 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.234082 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:02Z","lastTransitionTime":"2025-12-05T15:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.249494 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:02 crc kubenswrapper[4778]: E1205 15:56:02.249630 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.337951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.338028 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.338052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.338082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.338107 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:02Z","lastTransitionTime":"2025-12-05T15:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.441290 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.441416 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.441431 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.441452 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.441465 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:02Z","lastTransitionTime":"2025-12-05T15:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.544643 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.544703 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.544723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.544745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.544763 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:02Z","lastTransitionTime":"2025-12-05T15:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.650304 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.650399 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.650438 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.650467 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.650487 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:02Z","lastTransitionTime":"2025-12-05T15:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.754784 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.754846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.754864 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.754889 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.754907 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:02Z","lastTransitionTime":"2025-12-05T15:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.857217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.857279 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.857297 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.857324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.857342 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:02Z","lastTransitionTime":"2025-12-05T15:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.960226 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.960279 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.960295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.960318 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.960334 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:02Z","lastTransitionTime":"2025-12-05T15:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.988137 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:02 crc kubenswrapper[4778]: I1205 15:56:02.988252 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:02 crc kubenswrapper[4778]: E1205 15:56:02.988392 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:56:02 crc kubenswrapper[4778]: E1205 15:56:02.988425 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:56:02 crc kubenswrapper[4778]: E1205 15:56:02.988531 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:56:34.988502113 +0000 UTC m=+82.092298533 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:56:02 crc kubenswrapper[4778]: E1205 15:56:02.988603 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:56:34.988589446 +0000 UTC m=+82.092385856 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.063088 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.063140 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.063162 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.063190 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.063213 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:03Z","lastTransitionTime":"2025-12-05T15:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.089275 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.089528 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:56:35.089493662 +0000 UTC m=+82.193290072 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.089682 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.089799 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.089938 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.089939 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.089963 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.089979 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.089989 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.089996 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.090074 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 15:56:35.090048687 +0000 UTC m=+82.193845107 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.090102 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 15:56:35.090090238 +0000 UTC m=+82.193886648 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.166207 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.166276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.166291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.166326 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.166344 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:03Z","lastTransitionTime":"2025-12-05T15:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.249256 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.249337 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.249276 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.249512 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.249620 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.249772 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.269395 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.269456 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.269475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.269502 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.269521 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:03Z","lastTransitionTime":"2025-12-05T15:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.275550 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.292555 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.292784 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:56:03 crc kubenswrapper[4778]: E1205 15:56:03.292873 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs podName:48cc0dd1-7387-4df1-aa6a-198ac40c620d nodeName:}" failed. No retries permitted until 2025-12-05 15:56:19.292849867 +0000 UTC m=+66.396646277 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs") pod "network-metrics-daemon-8tvxd" (UID: "48cc0dd1-7387-4df1-aa6a-198ac40c620d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.303064 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.320612 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.343613 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.361195 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.373060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.373115 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.373134 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.373162 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.373221 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:03Z","lastTransitionTime":"2025-12-05T15:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.381324 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.399498 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.414193 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff1
18f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.437761 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a6470
27d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.449193 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.472629 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:00Z\\\",\\\"message\\\":\\\"opping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392006 6404 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392189 6404 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392396 6404 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392444 6404 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392485 6404 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392615 6404 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392634 6404 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.475870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.475924 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.475946 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.475973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.475992 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:03Z","lastTransitionTime":"2025-12-05T15:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.490285 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.512027 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.534555 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.553252 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.570783 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.579117 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:03 
crc kubenswrapper[4778]: I1205 15:56:03.579227 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.579247 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.579270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.579297 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:03Z","lastTransitionTime":"2025-12-05T15:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.589922 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:03Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.689672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.689741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.689758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.689787 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.689806 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:03Z","lastTransitionTime":"2025-12-05T15:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.792797 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.792936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.792953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.792977 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.792996 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:03Z","lastTransitionTime":"2025-12-05T15:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.896352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.896462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.896483 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.896508 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.896528 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:03Z","lastTransitionTime":"2025-12-05T15:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.999549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.999639 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.999663 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:03 crc kubenswrapper[4778]: I1205 15:56:03.999692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:03.999714 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:03Z","lastTransitionTime":"2025-12-05T15:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.102700 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.102785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.102809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.102843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.102865 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:04Z","lastTransitionTime":"2025-12-05T15:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.206306 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.206399 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.206412 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.206432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.206631 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:04Z","lastTransitionTime":"2025-12-05T15:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.249107 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:04 crc kubenswrapper[4778]: E1205 15:56:04.249409 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.309727 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.309791 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.309810 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.309837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.309857 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:04Z","lastTransitionTime":"2025-12-05T15:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.413331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.413447 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.413472 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.413506 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.413529 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:04Z","lastTransitionTime":"2025-12-05T15:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.517273 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.517344 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.517361 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.517423 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.517445 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:04Z","lastTransitionTime":"2025-12-05T15:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.620642 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.620720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.620735 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.620754 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.620767 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:04Z","lastTransitionTime":"2025-12-05T15:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.724063 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.724115 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.724126 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.724143 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.724155 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:04Z","lastTransitionTime":"2025-12-05T15:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.826844 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.826900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.826913 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.826931 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.826991 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:04Z","lastTransitionTime":"2025-12-05T15:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.930241 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.930314 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.930334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.930397 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:04 crc kubenswrapper[4778]: I1205 15:56:04.930420 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:04Z","lastTransitionTime":"2025-12-05T15:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.032882 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.032963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.032982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.033014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.033034 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.118791 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.119162 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.119181 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.119205 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.119228 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: E1205 15:56:05.140145 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:05Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.146751 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.146837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.146863 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.146896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.146925 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: E1205 15:56:05.170566 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:05Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.176144 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.176204 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
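Every one of these patch attempts fails for the same reason recorded in the err string: the TLS serving certificate presented by the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 has a notAfter of 2025-08-24T17:21:41Z, while the node clock reads 2025-12-05. A quick way to confirm what the endpoint is actually serving is the following stdlib-only Python sketch (fetch_webhook_cert.py is a hypothetical helper, not part of this journal; it assumes the endpoint is reachable from where the script runs):

# fetch_webhook_cert.py -- hypothetical helper, not part of this journal.
# Pulls the certificate served by the failing webhook endpoint so its
# validity window can be read; verification is deliberately disabled,
# since the point is to inspect a certificate that no longer verifies.
import socket
import ssl

HOST, PORT = "127.0.0.1", 9743  # endpoint taken from the Post URL in the log

ctx = ssl.create_default_context()
ctx.check_hostname = False      # must be disabled before relaxing verify_mode
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        # the dict form of getpeercert() is empty under CERT_NONE; DER works
        der = tls.getpeercert(binary_form=True)

print(ssl.DER_cert_to_PEM_cert(der), end="")

Piping the printed PEM through openssl x509 -noout -dates should show a notAfter matching the 2025-08-24T17:21:41Z in the error above; until that certificate is rotated, every node-status patch will keep failing the same way.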
event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.176216 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.176236 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.176250 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: E1205 15:56:05.194104 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:05Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.198458 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.198496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
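Independently of the webhook failure, every Ready condition in this stretch of the journal carries the same message: there is no CNI configuration file in /etc/kubernetes/cni/net.d/, so the runtime reports NetworkReady=false until the network provider writes its config. A small sketch for checking that directory from the node (cni_conf_check.py is a hypothetical helper; it assumes the path named in the log message, read access to it, and only the common config extensions):

# cni_conf_check.py -- hypothetical helper, not part of this journal.
# Lists and sanity-parses CNI config files in the directory named by the
# NetworkPluginNotReady message above.
import json
import pathlib

CONF_DIR = pathlib.Path("/etc/kubernetes/cni/net.d")  # path from the log

if not CONF_DIR.is_dir():
    raise SystemExit(f"{CONF_DIR}: directory does not exist")

files = sorted(p for p in CONF_DIR.iterdir()
               if p.suffix in (".conf", ".conflist", ".json"))
if not files:
    print(f"{CONF_DIR}: empty -- matches the 'no CNI configuration file' message")

for path in files:
    try:
        doc = json.loads(path.read_text())
    except (OSError, json.JSONDecodeError) as exc:
        print(f"{path.name}: present but unreadable or invalid JSON: {exc}")
        continue
    # a .conflist has a top-level "plugins" array; a bare .conf is the plugin
    plugins = doc.get("plugins", [doc]) if isinstance(doc, dict) else []
    types = [p.get("type", "?") for p in plugins if isinstance(p, dict)]
    print(f"{path.name}: parses OK, plugin type(s): {types}")

An empty listing here is expected while the network plugin's pods are still starting; the NotReady condition clears on its own once the plugin drops its config into the directory.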
event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.198507 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.198524 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.198539 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: E1205 15:56:05.213642 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:05Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.218118 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.218159 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.218171 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.218223 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.218236 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: E1205 15:56:05.232884 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:05Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:05 crc kubenswrapper[4778]: E1205 15:56:05.233125 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.234956 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
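The "exceeds retry count" entry above closes one full update cycle: the kubelet makes a small, fixed number of patch attempts per cycle (the will-retry errors here land roughly 20 ms apart) and then abandons the cycle until the next sync. To summarize those cycles from a saved journal, a sketch like the following can be used (retry_cycles.py is a hypothetical helper; it assumes one journal entry per line, i.e. normal journalctl output, so the hard-wrapped lines above would need to be unwrapped first):

# retry_cycles.py -- hypothetical helper, not part of this journal.
# Summarizes each node-status update cycle in a saved journal file:
# how many "will retry" patch attempts occurred before the kubelet gave up.
import re
import sys

RETRY = re.compile(r'E\d{4} (\d{2}:\d{2}:\d{2}\.\d{6}) .*"Error updating node status, will retry"')
GIVE_UP = re.compile(r'"Unable to update node status"')

attempts = []
with open(sys.argv[1], encoding="utf-8", errors="replace") as journal:
    for line in journal:
        match = RETRY.search(line)
        if match:
            attempts.append(match.group(1))  # klog timestamp of the attempt
        if GIVE_UP.search(line):
            print(f"gave up after {len(attempts)} attempts: {', '.join(attempts)}")
            attempts = []

For the cycle closed by the 15:56:05.233125 give-up it should count five attempts (the four condensed above plus the one whose payload tail opens this stretch), the last at 15:56:05.232884.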
event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.235013 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.235032 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.235055 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.235073 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.249607 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.249682 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.249613 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:05 crc kubenswrapper[4778]: E1205 15:56:05.249785 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:05 crc kubenswrapper[4778]: E1205 15:56:05.249969 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:05 crc kubenswrapper[4778]: E1205 15:56:05.250052 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.337815 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.337877 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.337892 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.337916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.337932 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.440492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.440531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.440543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.440560 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.440571 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.543728 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.543821 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.543844 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.543869 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.543890 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.647636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.648024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.648268 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.648511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.648727 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.751912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.751967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.751978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.751997 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.752011 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.854699 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.854747 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.854757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.854774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.854785 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.957868 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.957915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.957923 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.957938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:05 crc kubenswrapper[4778]: I1205 15:56:05.957948 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:05Z","lastTransitionTime":"2025-12-05T15:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.061337 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.062467 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.062684 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.062912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.063076 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:06Z","lastTransitionTime":"2025-12-05T15:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.166520 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.167033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.167179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.167332 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.167566 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:06Z","lastTransitionTime":"2025-12-05T15:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.248944 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:06 crc kubenswrapper[4778]: E1205 15:56:06.249095 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.269910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.269966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.269984 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.270013 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.270033 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:06Z","lastTransitionTime":"2025-12-05T15:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.373799 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.373927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.373954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.373984 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.374007 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:06Z","lastTransitionTime":"2025-12-05T15:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.476794 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.476859 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.476881 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.476907 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.476925 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:06Z","lastTransitionTime":"2025-12-05T15:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.580182 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.580541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.580725 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.580889 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.581031 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:06Z","lastTransitionTime":"2025-12-05T15:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.685228 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.685282 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.685298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.685323 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.685340 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:06Z","lastTransitionTime":"2025-12-05T15:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.789784 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.789854 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.789873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.789900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.789919 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:06Z","lastTransitionTime":"2025-12-05T15:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.893436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.893511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.893541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.893573 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.893595 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:06Z","lastTransitionTime":"2025-12-05T15:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.996982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.997047 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.997071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.997115 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:06 crc kubenswrapper[4778]: I1205 15:56:06.997143 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:06Z","lastTransitionTime":"2025-12-05T15:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.100440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.100504 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.100522 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.100545 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.100565 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:07Z","lastTransitionTime":"2025-12-05T15:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.203950 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.204014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.204031 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.204054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.204073 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:07Z","lastTransitionTime":"2025-12-05T15:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.249232 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.249341 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:07 crc kubenswrapper[4778]: E1205 15:56:07.249502 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:07 crc kubenswrapper[4778]: E1205 15:56:07.249693 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.249900 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:07 crc kubenswrapper[4778]: E1205 15:56:07.249993 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.306806 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.306869 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.306889 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.306913 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.306930 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:07Z","lastTransitionTime":"2025-12-05T15:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.409647 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.409926 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.410119 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.410317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.410512 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:07Z","lastTransitionTime":"2025-12-05T15:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.513861 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.514244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.514488 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.514700 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.514894 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:07Z","lastTransitionTime":"2025-12-05T15:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.618951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.619020 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.619043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.619074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.619098 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:07Z","lastTransitionTime":"2025-12-05T15:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.722745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.722796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.722811 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.722834 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.722851 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:07Z","lastTransitionTime":"2025-12-05T15:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.825695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.825763 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.825785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.825814 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.825835 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:07Z","lastTransitionTime":"2025-12-05T15:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.870987 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.891569 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.901713 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",
\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:07Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.924628 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1
c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b
0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:07Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.930512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.930583 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.930602 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.930635 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.930655 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:07Z","lastTransitionTime":"2025-12-05T15:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.939171 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:07Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.959815 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:07Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.974815 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:07Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:07 crc kubenswrapper[4778]: I1205 15:56:07.994964 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:07Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.012562 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:08Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.030139 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:08Z is after 2025-08-24T17:21:41Z" 
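Every status patch above is rejected for the same root cause: the kubelet cannot complete a TLS handshake with the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock now reads 2025-12-05. The sketch below is a minimal Go program reproducing the same validity check the client performs; the endpoint and the timestamps are taken from the log lines above, while the program itself is purely illustrative and assumes the webhook is still listening on that local port.

```go
// Illustrative sketch only (not part of the log): fetch the serving
// certificate from the webhook endpoint seen in the errors above and
// check its validity window, the same comparison that yields
// "x509: certificate has expired or is not yet valid".
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Webhook endpoint the kubelet is failing to POST to (from the log).
	addr := "127.0.0.1:9743"

	// InsecureSkipVerify lets us complete the handshake and retrieve the
	// certificate even though normal verification would fail; we then
	// inspect the validity period ourselves.
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial %s: %v", addr, err)
	}
	defer conn.Close()

	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		log.Fatal("no peer certificates presented")
	}
	cert := certs[0]

	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))

	// Same check that produces the recurring error in the log: the
	// current time must fall inside [NotBefore, NotAfter].
	now := time.Now()
	if now.After(cert.NotAfter) || now.Before(cert.NotBefore) {
		fmt.Printf("certificate NOT valid at %s\n", now.Format(time.RFC3339))
	}
}
```

Against the state recorded above, such a probe would be expected to print a notAfter of 2025-08-24T17:21:41Z and flag the certificate as invalid, which is why every subsequent status-update attempt in the entries that follow fails identically until the certificate is rotated.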
Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.034770 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.034892 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.034999 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.035039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.035065 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:08Z","lastTransitionTime":"2025-12-05T15:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.051533 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:08Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.106297 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e
895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:08Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.120478 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:08Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.138096 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.138151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.138168 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.138193 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.138209 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:08Z","lastTransitionTime":"2025-12-05T15:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.143671 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f
3333adf90c68697f66f22022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:00Z\\\",\\\"message\\\":\\\"opping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392006 6404 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392189 6404 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392396 6404 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392444 6404 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392485 6404 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392615 6404 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392634 6404 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:08Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.163089 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:08Z is after 
2025-08-24T17:21:41Z" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.182229 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:08Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.200707 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:08Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.218536 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:08Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.233009 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podI
P\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:08Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.241257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.241330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.241354 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.241410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.241434 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:08Z","lastTransitionTime":"2025-12-05T15:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.248692 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:08 crc kubenswrapper[4778]: E1205 15:56:08.248879 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.344790 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.344890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.344911 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.344940 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.344963 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:08Z","lastTransitionTime":"2025-12-05T15:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.448188 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.448254 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.448271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.448300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.448319 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:08Z","lastTransitionTime":"2025-12-05T15:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.551598 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.552051 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.552217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.552429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.552586 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:08Z","lastTransitionTime":"2025-12-05T15:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.656203 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.656311 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.656329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.656358 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.656430 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:08Z","lastTransitionTime":"2025-12-05T15:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.758636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.758674 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.758684 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.758701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.758713 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:08Z","lastTransitionTime":"2025-12-05T15:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.861482 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.861965 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.862180 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.862421 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.862599 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:08Z","lastTransitionTime":"2025-12-05T15:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.965850 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.965915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.965933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.965962 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:08 crc kubenswrapper[4778]: I1205 15:56:08.965981 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:08Z","lastTransitionTime":"2025-12-05T15:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.068747 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.068818 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.068837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.068867 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.068886 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:09Z","lastTransitionTime":"2025-12-05T15:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.172675 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.172743 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.172765 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.172794 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.172817 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:09Z","lastTransitionTime":"2025-12-05T15:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.249053 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.249087 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.249064 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:09 crc kubenswrapper[4778]: E1205 15:56:09.249262 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:09 crc kubenswrapper[4778]: E1205 15:56:09.249551 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:09 crc kubenswrapper[4778]: E1205 15:56:09.249652 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.276066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.276142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.276166 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.276196 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.276220 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:09Z","lastTransitionTime":"2025-12-05T15:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.379479 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.379552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.379570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.379597 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.379616 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:09Z","lastTransitionTime":"2025-12-05T15:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.483678 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.484098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.484325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.484615 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.484840 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:09Z","lastTransitionTime":"2025-12-05T15:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.587616 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.587953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.588250 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.588500 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.588684 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:09Z","lastTransitionTime":"2025-12-05T15:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.692642 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.692804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.692841 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.692876 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.692901 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:09Z","lastTransitionTime":"2025-12-05T15:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.796549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.796631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.796656 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.796685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.796707 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:09Z","lastTransitionTime":"2025-12-05T15:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.900255 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.900325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.900349 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.900407 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:09 crc kubenswrapper[4778]: I1205 15:56:09.900427 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:09Z","lastTransitionTime":"2025-12-05T15:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.003321 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.003412 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.003436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.003462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.003479 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:10Z","lastTransitionTime":"2025-12-05T15:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.107124 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.107197 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.107216 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.107241 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.107258 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:10Z","lastTransitionTime":"2025-12-05T15:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.210833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.210944 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.210970 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.211004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.211029 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:10Z","lastTransitionTime":"2025-12-05T15:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.248731 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:10 crc kubenswrapper[4778]: E1205 15:56:10.248976 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.313946 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.314023 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.314051 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.314085 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.314108 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:10Z","lastTransitionTime":"2025-12-05T15:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.417603 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.417677 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.417695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.417719 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.417737 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:10Z","lastTransitionTime":"2025-12-05T15:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.521275 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.521360 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.521410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.521442 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.521465 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:10Z","lastTransitionTime":"2025-12-05T15:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.625387 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.625449 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.625468 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.625495 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.625515 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:10Z","lastTransitionTime":"2025-12-05T15:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.728008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.728068 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.728079 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.728098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.728110 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:10Z","lastTransitionTime":"2025-12-05T15:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.831334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.831456 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.831487 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.831518 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.831543 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:10Z","lastTransitionTime":"2025-12-05T15:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.934079 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.934138 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.934155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.934178 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:10 crc kubenswrapper[4778]: I1205 15:56:10.934196 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:10Z","lastTransitionTime":"2025-12-05T15:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.036523 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.036589 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.036612 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.036640 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.036657 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:11Z","lastTransitionTime":"2025-12-05T15:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.139488 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.139547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.139563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.139587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.139601 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:11Z","lastTransitionTime":"2025-12-05T15:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.242991 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.243037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.243046 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.243063 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.243072 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:11Z","lastTransitionTime":"2025-12-05T15:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.249315 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.249350 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:11 crc kubenswrapper[4778]: E1205 15:56:11.249439 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.249595 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:11 crc kubenswrapper[4778]: E1205 15:56:11.249590 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:11 crc kubenswrapper[4778]: E1205 15:56:11.249653 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.345848 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.345922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.345939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.345971 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.345990 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:11Z","lastTransitionTime":"2025-12-05T15:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.449276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.449334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.449351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.449404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.449429 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:11Z","lastTransitionTime":"2025-12-05T15:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.553110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.553187 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.553212 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.553243 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.553265 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:11Z","lastTransitionTime":"2025-12-05T15:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.656524 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.656583 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.656596 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.656614 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.656628 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:11Z","lastTransitionTime":"2025-12-05T15:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.759837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.759893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.759909 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.759932 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.759950 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:11Z","lastTransitionTime":"2025-12-05T15:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.862909 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.862970 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.862988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.863012 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.863037 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:11Z","lastTransitionTime":"2025-12-05T15:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.966113 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.966177 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.966195 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.966219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:11 crc kubenswrapper[4778]: I1205 15:56:11.966237 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:11Z","lastTransitionTime":"2025-12-05T15:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.069001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.069054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.069071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.069095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.069114 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:12Z","lastTransitionTime":"2025-12-05T15:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.172955 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.173052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.173075 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.173104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.173124 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:12Z","lastTransitionTime":"2025-12-05T15:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.249080 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:12 crc kubenswrapper[4778]: E1205 15:56:12.249307 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.277199 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.277549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.277700 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.278121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.278334 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:12Z","lastTransitionTime":"2025-12-05T15:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.382550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.382619 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.382640 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.382663 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.382681 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:12Z","lastTransitionTime":"2025-12-05T15:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.492111 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.492180 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.492200 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.492224 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.492243 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:12Z","lastTransitionTime":"2025-12-05T15:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.595102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.595171 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.595193 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.595220 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.595242 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:12Z","lastTransitionTime":"2025-12-05T15:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.697685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.697751 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.697769 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.697794 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.697811 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:12Z","lastTransitionTime":"2025-12-05T15:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.800516 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.800574 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.800590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.800614 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.800632 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:12Z","lastTransitionTime":"2025-12-05T15:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.903717 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.903791 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.903809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.903833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:12 crc kubenswrapper[4778]: I1205 15:56:12.903853 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:12Z","lastTransitionTime":"2025-12-05T15:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.006257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.006313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.006330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.006353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.006483 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:13Z","lastTransitionTime":"2025-12-05T15:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.143965 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.144024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.144033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.144057 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.144071 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:13Z","lastTransitionTime":"2025-12-05T15:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.247238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.247747 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.247940 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.248556 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.248803 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:13Z","lastTransitionTime":"2025-12-05T15:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.248827 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.248827 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:13 crc kubenswrapper[4778]: E1205 15:56:13.249180 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.248881 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:13 crc kubenswrapper[4778]: E1205 15:56:13.249351 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:13 crc kubenswrapper[4778]: E1205 15:56:13.249456 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.284053 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617
f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.298937 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.332775 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:00Z\\\",\\\"message\\\":\\\"opping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392006 6404 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392189 6404 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392396 6404 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392444 6404 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392485 6404 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392615 6404 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392634 6404 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.352042 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.354403 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.354480 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.354503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.354536 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.354562 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:13Z","lastTransitionTime":"2025-12-05T15:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.373127 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.390279 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.403322 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.419302 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.436768 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.457440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.457511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.457530 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.457562 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.457584 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:13Z","lastTransitionTime":"2025-12-05T15:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.457804 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.475023 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.495914 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.518899 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.533736 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.553857 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.562672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.562714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.562731 4778 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.562758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.562778 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:13Z","lastTransitionTime":"2025-12-05T15:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.566783 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.581920 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdf51cd-d9a1-4678-8a50-170bcd551c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46dfbfa7cc094d6fe37248fcff273d561a58dfce35ed231a303e4bd70cf0fbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8be3eb2a66916642073568cde0334019c613c661e21540feaeeaba10a686477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\
":\\\"cri-o://acf9869dc28a932f93dc17e7251a9439cc0800654b657778a019904dea1065ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.602943 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:13Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.665584 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.665648 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.665660 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.665676 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.665690 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:13Z","lastTransitionTime":"2025-12-05T15:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.768823 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.768931 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.768950 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.768977 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.768996 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:13Z","lastTransitionTime":"2025-12-05T15:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.871651 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.871705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.871723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.871760 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.871779 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:13Z","lastTransitionTime":"2025-12-05T15:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.975326 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.975419 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.975441 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.975471 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:13 crc kubenswrapper[4778]: I1205 15:56:13.975492 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:13Z","lastTransitionTime":"2025-12-05T15:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.078596 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.078679 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.078702 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.078732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.078753 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:14Z","lastTransitionTime":"2025-12-05T15:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.184310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.184409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.184428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.184451 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.184469 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:14Z","lastTransitionTime":"2025-12-05T15:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.248795 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:56:14 crc kubenswrapper[4778]: E1205 15:56:14.249566 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.250000 4778 scope.go:117] "RemoveContainer" containerID="31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022"
Dec 05 15:56:14 crc kubenswrapper[4778]: E1205 15:56:14.250283 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.288547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.288624 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.288647 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.288730 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.288819 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:14Z","lastTransitionTime":"2025-12-05T15:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.391654 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.391715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.391733 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.391759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.391777 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:14Z","lastTransitionTime":"2025-12-05T15:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.494783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.494826 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.494844 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.494862 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.494873 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:14Z","lastTransitionTime":"2025-12-05T15:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.597832 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.597901 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.597918 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.597957 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.597976 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:14Z","lastTransitionTime":"2025-12-05T15:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.701439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.701532 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.701555 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.701591 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.701620 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:14Z","lastTransitionTime":"2025-12-05T15:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.805814 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.806074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.806092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.806117 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.806136 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:14Z","lastTransitionTime":"2025-12-05T15:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.909622 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.909727 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.909748 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.909774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:14 crc kubenswrapper[4778]: I1205 15:56:14.909793 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:14Z","lastTransitionTime":"2025-12-05T15:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.012848 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.012913 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.012937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.012965 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.012983 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.116122 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.116192 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.116217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.116247 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.116272 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.219156 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.219208 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.219226 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.219249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.219266 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.253448 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:15 crc kubenswrapper[4778]: E1205 15:56:15.253640 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.254139 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:15 crc kubenswrapper[4778]: E1205 15:56:15.254252 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.254321 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:15 crc kubenswrapper[4778]: E1205 15:56:15.254503 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.322562 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.322646 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.322663 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.322686 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.322703 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.425302 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.425351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.425403 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.425434 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.425456 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.527818 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.527867 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.527882 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.527902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.527916 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.619944 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.619973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.619981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.619994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.620003 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: E1205 15:56:15.631135 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:15Z is after 
2025-08-24T17:21:41Z" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.634956 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.634981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.634991 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.635007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.635018 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: E1205 15:56:15.646558 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:15Z is after 
2025-08-24T17:21:41Z" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.649852 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.649878 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.649887 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.649900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.649909 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: E1205 15:56:15.661311 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:15Z is after 
2025-08-24T17:21:41Z" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.665191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.665251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.665269 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.665293 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.665314 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: E1205 15:56:15.681014 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:15Z is after 
2025-08-24T17:21:41Z" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.684828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.684876 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.684894 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.684913 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.684931 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: E1205 15:56:15.703941 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:15Z is after 
2025-08-24T17:21:41Z" Dec 05 15:56:15 crc kubenswrapper[4778]: E1205 15:56:15.704161 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.706060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.706102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.706110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.706125 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.706135 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.807895 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.807958 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.807975 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.808000 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.808017 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.910616 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.910664 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.910685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.910706 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:15 crc kubenswrapper[4778]: I1205 15:56:15.910724 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:15Z","lastTransitionTime":"2025-12-05T15:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.014494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.014541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.014557 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.014575 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.014587 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:16Z","lastTransitionTime":"2025-12-05T15:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.117237 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.117282 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.117292 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.117310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.117322 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:16Z","lastTransitionTime":"2025-12-05T15:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.220441 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.220506 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.220524 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.220550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.220568 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:16Z","lastTransitionTime":"2025-12-05T15:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.248607 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:16 crc kubenswrapper[4778]: E1205 15:56:16.248804 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.323179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.323258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.323283 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.323315 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.323337 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:16Z","lastTransitionTime":"2025-12-05T15:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.427134 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.427205 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.427225 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.427249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.427265 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:16Z","lastTransitionTime":"2025-12-05T15:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.530046 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.530095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.530105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.530122 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.530131 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:16Z","lastTransitionTime":"2025-12-05T15:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.632496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.632554 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.632566 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.632584 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.632597 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:16Z","lastTransitionTime":"2025-12-05T15:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.734998 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.735028 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.735038 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.735052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.735061 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:16Z","lastTransitionTime":"2025-12-05T15:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.838886 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.838942 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.838956 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.838974 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.838986 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:16Z","lastTransitionTime":"2025-12-05T15:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.942139 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.942186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.942222 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.942240 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:16 crc kubenswrapper[4778]: I1205 15:56:16.942250 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:16Z","lastTransitionTime":"2025-12-05T15:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.044810 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.044876 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.044894 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.044914 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.044926 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:17Z","lastTransitionTime":"2025-12-05T15:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.149098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.149168 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.149180 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.149201 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.149233 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:17Z","lastTransitionTime":"2025-12-05T15:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.248943 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.249051 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.249116 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:17 crc kubenswrapper[4778]: E1205 15:56:17.249234 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:17 crc kubenswrapper[4778]: E1205 15:56:17.249468 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:17 crc kubenswrapper[4778]: E1205 15:56:17.249628 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.252979 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.253020 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.253038 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.253066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.253085 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:17Z","lastTransitionTime":"2025-12-05T15:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.356771 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.356824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.356842 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.356870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.356889 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:17Z","lastTransitionTime":"2025-12-05T15:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.461951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.462097 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.462116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.462144 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.462161 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:17Z","lastTransitionTime":"2025-12-05T15:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.565801 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.565861 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.565875 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.565898 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.565914 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:17Z","lastTransitionTime":"2025-12-05T15:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.668504 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.668558 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.668572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.668593 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.668605 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:17Z","lastTransitionTime":"2025-12-05T15:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.771550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.771613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.771626 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.771651 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.771665 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:17Z","lastTransitionTime":"2025-12-05T15:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.874736 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.874778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.874787 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.874804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.874814 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:17Z","lastTransitionTime":"2025-12-05T15:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.977750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.977831 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.977853 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.977880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:17 crc kubenswrapper[4778]: I1205 15:56:17.977899 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:17Z","lastTransitionTime":"2025-12-05T15:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.081445 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.081479 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.081487 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.081499 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.081508 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:18Z","lastTransitionTime":"2025-12-05T15:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.184454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.184502 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.184511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.184528 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.184540 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:18Z","lastTransitionTime":"2025-12-05T15:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.249346 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:18 crc kubenswrapper[4778]: E1205 15:56:18.249515 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.287399 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.287450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.287463 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.287481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.287492 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:18Z","lastTransitionTime":"2025-12-05T15:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.390205 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.390257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.390273 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.390298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.390317 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:18Z","lastTransitionTime":"2025-12-05T15:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.493879 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.493927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.493942 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.493966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.493983 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:18Z","lastTransitionTime":"2025-12-05T15:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.596262 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.596308 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.596344 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.596390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.596404 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:18Z","lastTransitionTime":"2025-12-05T15:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.698873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.698920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.698934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.698956 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.698984 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:18Z","lastTransitionTime":"2025-12-05T15:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.800820 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.800861 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.800872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.800888 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.800899 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:18Z","lastTransitionTime":"2025-12-05T15:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.904128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.904324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.904347 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.904408 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:18 crc kubenswrapper[4778]: I1205 15:56:18.904431 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:18Z","lastTransitionTime":"2025-12-05T15:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.007681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.007745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.007763 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.007790 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.007814 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:19Z","lastTransitionTime":"2025-12-05T15:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.111289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.111352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.111394 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.111421 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.111436 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:19Z","lastTransitionTime":"2025-12-05T15:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.214027 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.214094 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.214112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.214137 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.214154 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:19Z","lastTransitionTime":"2025-12-05T15:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.248800 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:19 crc kubenswrapper[4778]: E1205 15:56:19.248989 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.250161 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:19 crc kubenswrapper[4778]: E1205 15:56:19.250601 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.250889 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:19 crc kubenswrapper[4778]: E1205 15:56:19.251142 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.301261 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:19 crc kubenswrapper[4778]: E1205 15:56:19.301443 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:56:19 crc kubenswrapper[4778]: E1205 15:56:19.301516 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs podName:48cc0dd1-7387-4df1-aa6a-198ac40c620d nodeName:}" failed. No retries permitted until 2025-12-05 15:56:51.301496306 +0000 UTC m=+98.405292696 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs") pod "network-metrics-daemon-8tvxd" (UID: "48cc0dd1-7387-4df1-aa6a-198ac40c620d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.317053 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.317112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.317138 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.317163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.317182 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:19Z","lastTransitionTime":"2025-12-05T15:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.420882 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.421429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.421510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.421581 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.421646 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:19Z","lastTransitionTime":"2025-12-05T15:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
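The nestedpendingoperations record above pushes the next mount attempt out by 32 s (to 15:56:51, m=+98.4). That pattern is consistent with a per-operation backoff that doubles on each failure; the 500 ms base and the cap used below are illustrative assumptions, not values read from the kubelet source:

    # Doubling backoff from an assumed 500 ms base; base and cap are
    # assumptions for illustration only.
    delay = 0.5
    for attempt in range(1, 8):
        print(f"attempt {attempt}: retry in {delay:g}s")
        delay = min(delay * 2, 64)
    # attempt 7 prints "retry in 32s", matching durationBeforeRetry 32s in the log.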
Has your network provider started?"} Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.524178 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.524242 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.524260 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.524286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.524303 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:19Z","lastTransitionTime":"2025-12-05T15:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.627060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.627107 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.627116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.627137 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.627146 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:19Z","lastTransitionTime":"2025-12-05T15:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.722109 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrqmz_9b26d99a-f08e-41d1-b35c-5da99cbe3fb4/kube-multus/0.log" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.722577 4778 generic.go:334] "Generic (PLEG): container finished" podID="9b26d99a-f08e-41d1-b35c-5da99cbe3fb4" containerID="c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b" exitCode=1 Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.722620 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrqmz" event={"ID":"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4","Type":"ContainerDied","Data":"c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b"} Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.723163 4778 scope.go:117] "RemoveContainer" containerID="c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.731200 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.731236 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.731247 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.731273 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.731285 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:19Z","lastTransitionTime":"2025-12-05T15:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.738842 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.757189 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.774938 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.788413 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.803706 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.823313 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:19Z\\\",\\\"message\\\":\\\"2025-12-05T15:55:34+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874\\\\n2025-12-05T15:55:34+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874 to /host/opt/cni/bin/\\\\n2025-12-05T15:55:34Z [verbose] multus-daemon started\\\\n2025-12-05T15:55:34Z [verbose] Readiness Indicator file check\\\\n2025-12-05T15:56:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.835157 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.835215 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.835232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.835254 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.835271 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:19Z","lastTransitionTime":"2025-12-05T15:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.843125 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.853568 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.865256 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdf51cd-d9a1-4678-8a50-170bcd551c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46dfbfa7cc094d6fe37248fcff273d561a58dfce35ed231a303e4bd70cf0fbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8be3eb2a66916642073568cde0334019c613c661e21540feaeeaba10a686477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf9869dc28a932f93dc17e7251a9439cc0800654b657778a019904dea1065ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.878501 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 
2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.890581 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.906870 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.921730 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.935213 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff1
18f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.937963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.938034 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.938053 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.938081 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.938097 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:19Z","lastTransitionTime":"2025-12-05T15:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.967711 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:19 crc kubenswrapper[4778]: I1205 15:56:19.979747 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:19Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.009932 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:00Z\\\",\\\"message\\\":\\\"opping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392006 6404 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392189 6404 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392396 6404 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392444 6404 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392485 6404 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392615 6404 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392634 6404 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.021064 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.040880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.040929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.040967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.040986 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.040998 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:20Z","lastTransitionTime":"2025-12-05T15:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.143549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.143633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.143643 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.143673 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.143686 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:20Z","lastTransitionTime":"2025-12-05T15:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.245978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.246032 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.246046 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.246065 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.246078 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:20Z","lastTransitionTime":"2025-12-05T15:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.249463 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:20 crc kubenswrapper[4778]: E1205 15:56:20.249614 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.349353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.349405 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.349416 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.349430 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.349440 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:20Z","lastTransitionTime":"2025-12-05T15:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.452140 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.452216 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.452233 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.452258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.452280 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:20Z","lastTransitionTime":"2025-12-05T15:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.555020 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.555076 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.555092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.555115 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.555133 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:20Z","lastTransitionTime":"2025-12-05T15:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.657990 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.658183 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.658223 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.658244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.658258 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:20Z","lastTransitionTime":"2025-12-05T15:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.732528 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrqmz_9b26d99a-f08e-41d1-b35c-5da99cbe3fb4/kube-multus/0.log" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.732590 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrqmz" event={"ID":"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4","Type":"ContainerStarted","Data":"14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3"} Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.754081 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f
3333adf90c68697f66f22022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:00Z\\\",\\\"message\\\":\\\"opping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392006 6404 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392189 6404 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392396 6404 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392444 6404 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392485 6404 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392615 6404 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392634 6404 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.762816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.762854 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.762869 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.762890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.762904 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:20Z","lastTransitionTime":"2025-12-05T15:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.768131 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.783592 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.819599 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.834331 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.849316 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 
15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.864936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.865152 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.865187 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.865252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.865271 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:20Z","lastTransitionTime":"2025-12-05T15:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.866741 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.881425 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.895048 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.905965 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.923342 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:19Z\\\",\\\"message\\\":\\\"2025-12-05T15:55:34+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874\\\\n2025-12-05T15:55:34+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874 to /host/opt/cni/bin/\\\\n2025-12-05T15:55:34Z [verbose] multus-daemon started\\\\n2025-12-05T15:55:34Z [verbose] Readiness Indicator file check\\\\n2025-12-05T15:56:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.938387 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.947212 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.964482 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.967305 4778 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.967399 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.967419 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.967446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.967466 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:20Z","lastTransitionTime":"2025-12-05T15:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.978925 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdf51cd-d9a1-4678-8a50-170bcd551c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46dfbfa7cc094d6fe37248fcff273d561a58dfce35ed231a303e4bd70cf0fbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8be3eb2a66916642073568cde0334019c613c661e21540feaeeaba10a686477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf9869dc28a932f93dc17e7251a9439cc0800654b657778a019904dea1065ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:20 crc kubenswrapper[4778]: I1205 15:56:20.995326 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:20Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.008953 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:21Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.022872 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:21Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.070670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.070723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.070742 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.070768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.070788 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:21Z","lastTransitionTime":"2025-12-05T15:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.173461 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.173510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.173523 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.173541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.173553 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:21Z","lastTransitionTime":"2025-12-05T15:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.249431 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.249438 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.249716 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:21 crc kubenswrapper[4778]: E1205 15:56:21.249719 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:21 crc kubenswrapper[4778]: E1205 15:56:21.249767 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:21 crc kubenswrapper[4778]: E1205 15:56:21.249546 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.276488 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.276519 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.276531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.276547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.276560 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:21Z","lastTransitionTime":"2025-12-05T15:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.378829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.378903 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.378920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.378947 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.378969 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:21Z","lastTransitionTime":"2025-12-05T15:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.481564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.481612 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.481622 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.481637 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.481648 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:21Z","lastTransitionTime":"2025-12-05T15:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.584018 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.584051 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.584060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.584075 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.584086 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:21Z","lastTransitionTime":"2025-12-05T15:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.685906 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.685945 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.685953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.685966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.685998 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:21Z","lastTransitionTime":"2025-12-05T15:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.788850 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.788891 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.788903 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.788919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.788931 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:21Z","lastTransitionTime":"2025-12-05T15:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.892270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.892318 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.892328 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.892342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.892353 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:21Z","lastTransitionTime":"2025-12-05T15:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.995049 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.995084 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.995095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.995110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:21 crc kubenswrapper[4778]: I1205 15:56:21.995120 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:21Z","lastTransitionTime":"2025-12-05T15:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.097970 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.098047 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.098071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.098102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.098127 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:22Z","lastTransitionTime":"2025-12-05T15:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.200781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.200814 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.200823 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.200837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.200846 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:22Z","lastTransitionTime":"2025-12-05T15:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.249126 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:56:22 crc kubenswrapper[4778]: E1205 15:56:22.249410 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.303714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.303784 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.303800 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.303828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.303845 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:22Z","lastTransitionTime":"2025-12-05T15:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.407054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.407111 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.407127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.407151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.407167 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:22Z","lastTransitionTime":"2025-12-05T15:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.509437 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.509475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.509486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.509502 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.509512 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:22Z","lastTransitionTime":"2025-12-05T15:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.612280 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.612351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.612379 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.612396 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.612407 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:22Z","lastTransitionTime":"2025-12-05T15:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.715865 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.715921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.715932 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.715947 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.715960 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:22Z","lastTransitionTime":"2025-12-05T15:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.819673 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.819733 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.819748 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.819770 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.819784 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:22Z","lastTransitionTime":"2025-12-05T15:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.922430 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.922503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.922521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.922550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:22 crc kubenswrapper[4778]: I1205 15:56:22.922569 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:22Z","lastTransitionTime":"2025-12-05T15:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.028431 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.028940 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.029143 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.030099 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.030169 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:23Z","lastTransitionTime":"2025-12-05T15:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.133596 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.133677 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.133703 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.133738 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.133763 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:23Z","lastTransitionTime":"2025-12-05T15:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.266169 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.266347 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.266571 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:56:23 crc kubenswrapper[4778]: E1205 15:56:23.266553 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:56:23 crc kubenswrapper[4778]: E1205 15:56:23.266983 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:56:23 crc kubenswrapper[4778]: E1205 15:56:23.267467 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.268287 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.268325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.268337 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.268351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.268381 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:23Z","lastTransitionTime":"2025-12-05T15:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.287860 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.308931 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.326473 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.346998 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdf51cd-d9a1-4678-8a50-170bcd551c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46dfbfa7cc094d6fe37248fcff273d561a58dfce35ed231a303e4bd70cf0fbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8be3eb2a66916642073568cde0334019c613c661e21540feaeeaba10a686477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf9869dc28a932
f93dc17e7251a9439cc0800654b657778a019904dea1065ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.362670 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.373145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.373196 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.373211 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.373237 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.373254 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:23Z","lastTransitionTime":"2025-12-05T15:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.386724 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.402607 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.425390 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:00Z\\\",\\\"message\\\":\\\"opping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392006 6404 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392189 6404 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392396 6404 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392444 6404 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392485 6404 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392615 6404 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392634 6404 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.436959 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.451168 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.466730 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.475761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.475836 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.475849 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.475872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.475888 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:23Z","lastTransitionTime":"2025-12-05T15:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.481621 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.499980 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 
15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.518928 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.537285 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.549335 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.568426 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:19Z\\\",\\\"message\\\":\\\"2025-12-05T15:55:34+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874\\\\n2025-12-05T15:55:34+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874 to /host/opt/cni/bin/\\\\n2025-12-05T15:55:34Z [verbose] multus-daemon started\\\\n2025-12-05T15:55:34Z [verbose] Readiness Indicator file check\\\\n2025-12-05T15:56:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.578997 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.579052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.579064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.579087 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.579104 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:23Z","lastTransitionTime":"2025-12-05T15:56:23Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.592341 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T15:56:23Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.681832 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.681870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.681881 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.681895 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.681905 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:23Z","lastTransitionTime":"2025-12-05T15:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.783757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.783836 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.783860 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.783891 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.783916 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:23Z","lastTransitionTime":"2025-12-05T15:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.886541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.886590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.886607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.886630 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.886646 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:23Z","lastTransitionTime":"2025-12-05T15:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.989812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.989887 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.989901 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.989921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:23 crc kubenswrapper[4778]: I1205 15:56:23.989942 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:23Z","lastTransitionTime":"2025-12-05T15:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.093284 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.093330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.093340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.093356 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.093379 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:24Z","lastTransitionTime":"2025-12-05T15:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.195972 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.196029 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.196045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.196071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.196086 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:24Z","lastTransitionTime":"2025-12-05T15:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.248730 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:24 crc kubenswrapper[4778]: E1205 15:56:24.248870 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.298528 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.298590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.298612 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.298636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.298654 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:24Z","lastTransitionTime":"2025-12-05T15:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.400897 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.400948 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.400958 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.400973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.400984 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:24Z","lastTransitionTime":"2025-12-05T15:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.503454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.503513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.503524 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.503543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.503555 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:24Z","lastTransitionTime":"2025-12-05T15:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.606110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.606179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.606195 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.606219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.606238 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:24Z","lastTransitionTime":"2025-12-05T15:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.709258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.709354 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.709402 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.709430 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.709449 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:24Z","lastTransitionTime":"2025-12-05T15:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.812816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.812870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.812883 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.812901 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.812918 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:24Z","lastTransitionTime":"2025-12-05T15:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.916215 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.916284 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.916303 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.916329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:24 crc kubenswrapper[4778]: I1205 15:56:24.916348 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:24Z","lastTransitionTime":"2025-12-05T15:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.019312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.019392 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.019409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.019429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.019443 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.122147 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.122224 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.122242 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.122268 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.122286 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.225234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.225294 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.225311 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.225336 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.225354 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.249347 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.249434 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.249470 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:56:25 crc kubenswrapper[4778]: E1205 15:56:25.249558 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:56:25 crc kubenswrapper[4778]: E1205 15:56:25.249719 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:56:25 crc kubenswrapper[4778]: E1205 15:56:25.249835 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.328477 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.328512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.328523 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.328538 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.328549 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
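Every pod-sync failure above traces back to one condition: the kubelet finds no CNI configuration in /etc/kubernetes/cni/net.d/. A quick host-side check is sketched below; the extension list (.conf, .conflist, .json) is an assumption based on common CNI layouts, not a reimplementation of the kubelet's own config loader.

    // cnicheck.go: list candidate CNI config files to confirm (or rule out)
    // the "no CNI configuration file" condition reported in the log.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // path taken verbatim from the log
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("cannot read", dir, "->", err)
            return
        }
        found := 0
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                fmt.Println("candidate CNI config:", filepath.Join(dir, e.Name()))
                found++
            }
        }
        if found == 0 {
            // Matches the kubelet's complaint: NetworkReady stays false until
            // the network provider writes a config here.
            fmt.Println("no CNI configuration files found")
        }
    }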
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.430789 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.430839 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.430852 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.430870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.430882 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.534660 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.534725 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.534746 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.534775 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.534796 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.637803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.637861 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.637873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.637889 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.637898 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.741862 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.741902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.741912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.741930 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.741940 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.805324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.805404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.805418 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.805437 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.805452 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:25 crc kubenswrapper[4778]: E1205 15:56:25.824213 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:25Z is after 2025-08-24T17:21:41Z"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.828258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.828309 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
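This is the failure that actually blocks the status update: the node-identity webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24, so every patch attempt is rejected with the x509 error above. A minimal standard-library sketch for inspecting that certificate's validity window from the host follows; only the address comes from the log, everything else is illustrative.

    // certcheck.go: dial the webhook endpoint with verification disabled and
    // compare the presented certificate's validity window against "now" --
    // the same comparison the kubelet's TLS handshake is failing above.
    package main

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
            InsecureSkipVerify: true, // we only want to read the cert, not trust it
        })
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        now := time.Now().UTC()
        fmt.Println("NotBefore:", cert.NotBefore)
        fmt.Println("NotAfter: ", cert.NotAfter)
        fmt.Println("Now:      ", now)
        if now.After(cert.NotAfter) {
            // Corresponds to "certificate has expired or is not yet valid:
            // current time ... is after ..." in the log.
            fmt.Println("certificate is expired")
        }
    }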
event="NodeHasNoDiskPressure" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.828322 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.828342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.828354 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:25 crc kubenswrapper[4778]: E1205 15:56:25.842026 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:25Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.846249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.846301 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.846320 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.846346 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.846401 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:25 crc kubenswrapper[4778]: E1205 15:56:25.863864 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:25Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.867910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.867984 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.868007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.868037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.868059 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:25 crc kubenswrapper[4778]: E1205 15:56:25.887441 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:25Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.891067 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.891106 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
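
Under the journal escaping, each failed attempt carries the same strategic-merge patch (note the $setElementOrder/conditions directive): the four node conditions, allocatable and capacity, the node's container image inventory, and nodeInfo. The image inventory is what makes these records so large; the rest is compact. A reduced Go sketch that decodes a trimmed copy of the payload, with values taken from the record above:

```go
package main

import (
	"encoding/json"
	"fmt"
)

type condition struct {
	Type    string `json:"type"`
	Status  string `json:"status"`
	Reason  string `json:"reason"`
	Message string `json:"message"`
}

func main() {
	// Reduced version of the patch carried by the failing records above; the
	// real payload also includes $setElementOrder/conditions, the full image
	// inventory, and nodeInfo, which dominate its size.
	patch := `{"status":{"conditions":[{"type":"Ready","status":"False","reason":"KubeletNotReady","message":"container runtime network not ready"}],"capacity":{"cpu":"12","ephemeral-storage":"83293888Ki","memory":"32865356Ki"},"allocatable":{"cpu":"11800m","ephemeral-storage":"76396645454","memory":"32404556Ki"}}}`

	var doc struct {
		Status struct {
			Conditions  []condition       `json:"conditions"`
			Capacity    map[string]string `json:"capacity"`
			Allocatable map[string]string `json:"allocatable"`
		} `json:"status"`
	}
	if err := json.Unmarshal([]byte(patch), &doc); err != nil {
		panic(err)
	}
	for _, c := range doc.Status.Conditions {
		fmt.Printf("condition %s=%s reason=%s\n", c.Type, c.Status, c.Reason)
	}
	fmt.Println("capacity:", doc.Status.Capacity, "allocatable:", doc.Status.Allocatable)
}
```
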
event="NodeHasNoDiskPressure" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.891118 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.891133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.891145 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:25 crc kubenswrapper[4778]: E1205 15:56:25.908666 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:25Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:25 crc kubenswrapper[4778]: E1205 15:56:25.909332 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.911676 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
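
The third consecutive failure above ends the cycle with "update node status exceeds retry count": the kubelet attempts the status update only a fixed, small number of times per sync before giving up until the next heartbeat. A simplified sketch of that control flow follows; the constant mirrors the small retry limit in upstream kubelet, but this is an illustration, not the kubelet's actual implementation.

```go
package main

import (
	"errors"
	"fmt"
)

// Retry limit mirroring the small constant upstream kubelet uses for
// node-status updates; the exact value here is illustrative.
const nodeStatusUpdateRetry = 5

// tryUpdateNodeStatus stands in for the PATCH that the admission webhook
// rejects in the log above.
func tryUpdateNodeStatus() error {
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdateNodeStatus(); err != nil {
			fmt.Printf("Error updating node status, will retry (attempt %d): %v\n", i+1, err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("Unable to update node status:", err)
	}
}
```
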
event="NodeHasSufficientMemory" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.911708 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.911720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.911740 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:25 crc kubenswrapper[4778]: I1205 15:56:25.911751 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:25Z","lastTransitionTime":"2025-12-05T15:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.014636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.014699 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.014717 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.014742 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.014761 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:26Z","lastTransitionTime":"2025-12-05T15:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.117095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.117174 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.117191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.117607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.117650 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:26Z","lastTransitionTime":"2025-12-05T15:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.219994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.220024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.220034 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.220050 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.220061 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:26Z","lastTransitionTime":"2025-12-05T15:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.249006 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:26 crc kubenswrapper[4778]: E1205 15:56:26.249189 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.323086 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.323164 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.323176 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.323197 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.323208 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:26Z","lastTransitionTime":"2025-12-05T15:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.426548 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.426663 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.426681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.426707 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.426727 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:26Z","lastTransitionTime":"2025-12-05T15:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.529661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.529701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.529710 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.529723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.529734 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:26Z","lastTransitionTime":"2025-12-05T15:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.633714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.633804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.633819 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.633846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.633864 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:26Z","lastTransitionTime":"2025-12-05T15:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.737452 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.737513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.737527 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.737551 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.737566 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:26Z","lastTransitionTime":"2025-12-05T15:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.840614 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.840713 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.840738 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.840776 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.840797 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:26Z","lastTransitionTime":"2025-12-05T15:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.943482 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.943551 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.943568 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.943594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:26 crc kubenswrapper[4778]: I1205 15:56:26.943616 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:26Z","lastTransitionTime":"2025-12-05T15:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.046184 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.046245 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.046265 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.046289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.046307 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:27Z","lastTransitionTime":"2025-12-05T15:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.149114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.149200 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.149228 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.149258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.149275 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:27Z","lastTransitionTime":"2025-12-05T15:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.249309 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.249423 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:27 crc kubenswrapper[4778]: E1205 15:56:27.249602 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.249632 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:27 crc kubenswrapper[4778]: E1205 15:56:27.249820 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:27 crc kubenswrapper[4778]: E1205 15:56:27.250582 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.251953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.252003 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.252022 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.252053 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.252077 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:27Z","lastTransitionTime":"2025-12-05T15:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.355679 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.355745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.355763 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.355787 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.355807 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:27Z","lastTransitionTime":"2025-12-05T15:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
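
These pod-sync failures all hinge on one readiness gate: the container runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ contains no CNI configuration, and pods that need the cluster network (network-check-target-xd92c, networking-console-plugin-85b44fc459-gdk6g, network-check-source-55646444c4-trplf, network-metrics-daemon-8tvxd) cannot get a sandbox until a network plugin writes one. A small Go sketch of the check the message implies; the directory path is taken from the log:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	const dir = "/etc/kubernetes/cni/net.d" // path taken from the log message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	found := false
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config:", filepath.Join(dir, e.Name()))
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file found; runtime stays NetworkReady=false")
	}
}
```

Host-network pods are unaffected by this gate, which is why the kubelet itself keeps running and logging while these pods stay pending.
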
Has your network provider started?"} Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.458243 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.458462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.458481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.458506 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.458524 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:27Z","lastTransitionTime":"2025-12-05T15:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.561413 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.561491 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.561515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.561547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.561566 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:27Z","lastTransitionTime":"2025-12-05T15:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.664916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.664982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.664999 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.665025 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.665043 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:27Z","lastTransitionTime":"2025-12-05T15:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.767999 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.768058 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.768074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.768098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.768115 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:27Z","lastTransitionTime":"2025-12-05T15:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.870233 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.870297 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.870315 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.870340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.870357 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:27Z","lastTransitionTime":"2025-12-05T15:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.973583 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.973657 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.973675 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.973701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:27 crc kubenswrapper[4778]: I1205 15:56:27.973723 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:27Z","lastTransitionTime":"2025-12-05T15:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.077336 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.077433 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.077457 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.077488 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.077511 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:28Z","lastTransitionTime":"2025-12-05T15:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.180593 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.180650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.180667 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.180690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.180707 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:28Z","lastTransitionTime":"2025-12-05T15:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.248739 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:28 crc kubenswrapper[4778]: E1205 15:56:28.248964 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
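
The "Recording event message" / "Node became not ready" blocks repeat at an almost exact 100 ms cadence (15:56:26.014, .117, .220, .323, ...). That spacing is consistent with the kubelet's fast status-update path, which polls at roughly 100 ms intervals during startup and re-evaluates the Ready condition until the node can be marked Ready; that attribution is an inference from the timing, not something the log states. A schematic of such a loop:

```go
package main

import (
	"fmt"
	"time"
)

// networkReady stands in for the CRI status check that keeps failing in the
// log ("no CNI configuration file in /etc/kubernetes/cni/net.d/").
func networkReady() bool { return false }

func main() {
	ticker := time.NewTicker(100 * time.Millisecond) // cadence inferred from the timestamps above
	defer ticker.Stop()
	deadline := time.After(1 * time.Second) // bounded here so the sketch terminates

	for {
		select {
		case t := <-ticker.C:
			if networkReady() {
				fmt.Println(t.Format("15:04:05.000"), "node is ready")
				return
			}
			fmt.Println(t.Format("15:04:05.000"), `"Node became not ready" reason=KubeletNotReady`)
		case <-deadline:
			return
		}
	}
}
```
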
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.249953 4778 scope.go:117] "RemoveContainer" containerID="31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.285138 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.285193 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.285211 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.285234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.285251 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:28Z","lastTransitionTime":"2025-12-05T15:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.388157 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.388249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.388306 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.388554 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.388576 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:28Z","lastTransitionTime":"2025-12-05T15:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.492411 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.492478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.492494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.492520 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.492538 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:28Z","lastTransitionTime":"2025-12-05T15:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.597057 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.597146 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.597167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.597192 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.597251 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:28Z","lastTransitionTime":"2025-12-05T15:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.700355 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.700439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.700455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.700478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.700497 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:28Z","lastTransitionTime":"2025-12-05T15:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.767217 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/2.log" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.771742 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751"} Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.772442 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.787806 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508a
c0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:28Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.802672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.802752 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.802766 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.802785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.802798 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:28Z","lastTransitionTime":"2025-12-05T15:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.828083 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:28Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.845217 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:28Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.870786 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:00Z\\\",\\\"message\\\":\\\"opping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392006 6404 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392189 6404 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392396 6404 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392444 6404 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392485 6404 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392615 6404 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392634 6404 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:28Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.882565 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:28Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.905798 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.905879 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.905903 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.905933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.905957 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:28Z","lastTransitionTime":"2025-12-05T15:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.918608 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:28Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.936383 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:28Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.950487 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:28Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.961069 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:28Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.972887 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T15:56:28Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:28 crc kubenswrapper[4778]: I1205 15:56:28.985516 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:19Z\\\",\\\"message\\\":\\\"2025-12-05T15:55:34+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874\\\\n2025-12-05T15:55:34+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874 to /host/opt/cni/bin/\\\\n2025-12-05T15:55:34Z [verbose] multus-daemon started\\\\n2025-12-05T15:55:34Z [verbose] Readiness Indicator file check\\\\n2025-12-05T15:56:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:28Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.001185 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:28Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.008138 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.008186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:29 crc 
kubenswrapper[4778]: I1205 15:56:29.008200 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.008216 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.008226 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:29Z","lastTransitionTime":"2025-12-05T15:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.014421 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.025886 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdf51cd-d9a1-4678-8a50-170bcd551c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46dfbfa7cc094d6fe37248fcff273d561a58dfce35ed231a303e4bd70cf0fbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8be3eb2a66916642073568cde0334019c613c661e21540feaeeaba10a686477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf9869dc28a932f93dc17e7251a9439cc0800654b657778a019904dea1065ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.040976 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 
2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.059284 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.071731 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.086941 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.109915 4778 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.109957 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.109967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.109985 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.109999 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:29Z","lastTransitionTime":"2025-12-05T15:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.212745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.212782 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.212793 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.212809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.212819 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:29Z","lastTransitionTime":"2025-12-05T15:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.248856 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.248893 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.248857 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:29 crc kubenswrapper[4778]: E1205 15:56:29.248991 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:29 crc kubenswrapper[4778]: E1205 15:56:29.249051 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:29 crc kubenswrapper[4778]: E1205 15:56:29.249123 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.315908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.315962 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.315973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.315987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.315996 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:29Z","lastTransitionTime":"2025-12-05T15:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.419417 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.419497 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.419519 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.419550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.419577 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:29Z","lastTransitionTime":"2025-12-05T15:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.522705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.522751 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.522760 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.522776 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.522805 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:29Z","lastTransitionTime":"2025-12-05T15:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.625645 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.625721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.625740 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.625767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.625785 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:29Z","lastTransitionTime":"2025-12-05T15:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.729783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.729853 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.729870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.729896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.729914 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:29Z","lastTransitionTime":"2025-12-05T15:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.779011 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/3.log" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.780130 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/2.log" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.783856 4778 generic.go:334] "Generic (PLEG): container finished" podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751" exitCode=1 Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.783920 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751"} Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.783974 4778 scope.go:117] "RemoveContainer" containerID="31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.785097 4778 scope.go:117] "RemoveContainer" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751" Dec 05 15:56:29 crc kubenswrapper[4778]: E1205 15:56:29.785529 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.812890 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 
15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.832603 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.832654 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.832671 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.832695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.832715 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:29Z","lastTransitionTime":"2025-12-05T15:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.835984 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.859496 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.880921 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.909201 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.929193 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:19Z\\\",\\\"message\\\":\\\"2025-12-05T15:55:34+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874\\\\n2025-12-05T15:55:34+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874 to /host/opt/cni/bin/\\\\n2025-12-05T15:55:34Z [verbose] multus-daemon started\\\\n2025-12-05T15:55:34Z [verbose] Readiness Indicator file check\\\\n2025-12-05T15:56:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.940924 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.940998 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.941021 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.941044 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.941063 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:29Z","lastTransitionTime":"2025-12-05T15:56:29Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.951791 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.966777 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:29 crc kubenswrapper[4778]: I1205 15:56:29.983047 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.000411 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdf51cd-d9a1-4678-8a50-170bcd551c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46dfbfa7cc094d6fe37248fcff273d561a58dfce35ed231a303e4bd70cf0fbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8be3eb2a66916642073568cde0334019c613c661e21540feaeeaba10a686477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf9869dc28a932f93dc17e7251a9439cc0800654b657778a019904dea1065ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:29Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.018836 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.032273 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.043489 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.043540 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.043554 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.043573 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.043586 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:30Z","lastTransitionTime":"2025-12-05T15:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.046168 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.077197 4778 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f2877bedcd4ee5d818aa636ec47f2db5f6236f3333adf90c68697f66f22022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:00Z\\\",\\\"message\\\":\\\"opping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392006 6404 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392189 6404 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392396 6404 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392444 6404 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:00.392485 6404 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392615 6404 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 15:56:00.392634 6404 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:29Z\\\",\\\"message\\\":\\\".333395 6771 reflector.go:311] Stopping reflector 
*v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:29.333396 6771 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:29.334516 6771 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 15:56:29.334606 6771 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 15:56:29.334623 6771 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 15:56:29.334632 6771 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 15:56:29.334645 6771 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 15:56:29.334645 6771 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 15:56:29.334654 6771 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 15:56:29.334689 6771 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 15:56:29.334699 6771 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 15:56:29.334730 6771 factory.go:656] Stopping watch factory\\\\nI1205 15:56:29.334753 6771 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.090015 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.110008 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.145988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.146027 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.146040 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.146056 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.146067 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:30Z","lastTransitionTime":"2025-12-05T15:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.169754 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.189566 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.248996 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:30 crc kubenswrapper[4778]: E1205 15:56:30.249202 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.249748 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.249795 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.249812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.249834 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.249852 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:30Z","lastTransitionTime":"2025-12-05T15:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.353458 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.353515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.353531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.353552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.353570 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:30Z","lastTransitionTime":"2025-12-05T15:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.455857 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.455897 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.455906 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.455919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.455931 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:30Z","lastTransitionTime":"2025-12-05T15:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.558585 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.558635 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.558652 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.558674 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.558688 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:30Z","lastTransitionTime":"2025-12-05T15:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.661739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.661850 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.661880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.661915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.661940 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:30Z","lastTransitionTime":"2025-12-05T15:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.764929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.764983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.764997 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.765018 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.765043 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:30Z","lastTransitionTime":"2025-12-05T15:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.788212 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/3.log" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.793059 4778 scope.go:117] "RemoveContainer" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751" Dec 05 15:56:30 crc kubenswrapper[4778]: E1205 15:56:30.793470 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.808575 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.828448 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.842057 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.858272 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.868455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:30 
crc kubenswrapper[4778]: I1205 15:56:30.868575 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.868601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.868630 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.868649 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:30Z","lastTransitionTime":"2025-12-05T15:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.874973 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.894016 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:19Z\\\",\\\"message\\\":\\\"2025-12-05T15:55:34+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874\\\\n2025-12-05T15:55:34+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874 to /host/opt/cni/bin/\\\\n2025-12-05T15:55:34Z [verbose] multus-daemon started\\\\n2025-12-05T15:55:34Z [verbose] Readiness Indicator file check\\\\n2025-12-05T15:56:19Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.913962 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.928939 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.947029 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdf51cd-d9a1-4678-8a50-170bcd551c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46dfbfa7cc094d6fe37248fcff273d561a58dfce35ed231a303e4bd70cf0fbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8be3eb2a66916642073568cde0334019c613c661e21540feaeeaba10a686477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf9869dc28a932f93dc17e7251a9439cc0800654b657778a019904dea1065ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.966457 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 
2025-08-24T17:21:41Z" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.971519 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.971585 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.971611 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.971675 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.971703 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:30Z","lastTransitionTime":"2025-12-05T15:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:30 crc kubenswrapper[4778]: I1205 15:56:30.988083 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:30Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.005640 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:31Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.021001 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:31Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.037325 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:31Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.061254 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e
895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:31Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.075009 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.075139 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.075331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.075361 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.075534 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:31Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.075482 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:31Z","lastTransitionTime":"2025-12-05T15:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.095928 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:29Z\\\",\\\"message\\\":\\\".333395 6771 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:29.333396 6771 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:29.334516 6771 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 15:56:29.334606 6771 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 15:56:29.334623 6771 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 15:56:29.334632 6771 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 15:56:29.334645 6771 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 15:56:29.334645 6771 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 15:56:29.334654 6771 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 15:56:29.334689 6771 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 15:56:29.334699 6771 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 15:56:29.334730 6771 factory.go:656] Stopping watch factory\\\\nI1205 15:56:29.334753 6771 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:56:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:31Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.108951 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:31Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.178150 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.178194 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.178207 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.178227 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.178242 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:31Z","lastTransitionTime":"2025-12-05T15:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.248591 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:31 crc kubenswrapper[4778]: E1205 15:56:31.248762 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.248984 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:31 crc kubenswrapper[4778]: E1205 15:56:31.249046 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.249271 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:31 crc kubenswrapper[4778]: E1205 15:56:31.249347 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.281120 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.281174 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.281185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.281201 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.281210 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:31Z","lastTransitionTime":"2025-12-05T15:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.383559 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.383614 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.383632 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.383656 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.383673 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:31Z","lastTransitionTime":"2025-12-05T15:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.487576 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.487999 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.488143 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.488305 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.488505 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:31Z","lastTransitionTime":"2025-12-05T15:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.591540 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.591600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.591613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.591636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.591651 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:31Z","lastTransitionTime":"2025-12-05T15:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.695471 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.695538 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.695564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.695592 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.695617 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:31Z","lastTransitionTime":"2025-12-05T15:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.798136 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.798295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.798319 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.798342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.798359 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:31Z","lastTransitionTime":"2025-12-05T15:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.902000 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.902066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.902087 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.902115 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:31 crc kubenswrapper[4778]: I1205 15:56:31.902136 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:31Z","lastTransitionTime":"2025-12-05T15:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.005494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.005559 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.005590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.005620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.005638 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:32Z","lastTransitionTime":"2025-12-05T15:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.109234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.109699 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.109882 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.110073 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.110214 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:32Z","lastTransitionTime":"2025-12-05T15:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.212996 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.213074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.213096 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.213122 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.213199 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:32Z","lastTransitionTime":"2025-12-05T15:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.249123 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:32 crc kubenswrapper[4778]: E1205 15:56:32.249301 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.316308 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.316361 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.316394 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.316417 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.316434 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:32Z","lastTransitionTime":"2025-12-05T15:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.419813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.420189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.420317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.420466 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.420613 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:32Z","lastTransitionTime":"2025-12-05T15:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.524856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.525358 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.525570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.525771 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.525957 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:32Z","lastTransitionTime":"2025-12-05T15:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.629989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.630070 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.630096 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.630126 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.630153 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:32Z","lastTransitionTime":"2025-12-05T15:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.733797 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.734164 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.734661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.734925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.735137 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:32Z","lastTransitionTime":"2025-12-05T15:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.839655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.839715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.839733 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.839757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.839776 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:32Z","lastTransitionTime":"2025-12-05T15:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.943277 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.943591 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.943626 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.943658 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:32 crc kubenswrapper[4778]: I1205 15:56:32.943684 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:32Z","lastTransitionTime":"2025-12-05T15:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.046800 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.046859 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.046879 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.046907 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.046929 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:33Z","lastTransitionTime":"2025-12-05T15:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.150359 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.150455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.150471 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.150497 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.150513 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:33Z","lastTransitionTime":"2025-12-05T15:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.248968 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:33 crc kubenswrapper[4778]: E1205 15:56:33.249168 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.249199 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.249332 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:33 crc kubenswrapper[4778]: E1205 15:56:33.249345 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:33 crc kubenswrapper[4778]: E1205 15:56:33.250804 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.260702 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.260770 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.260792 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.260822 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.260845 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:33Z","lastTransitionTime":"2025-12-05T15:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.280351 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":
\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d65
02f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.299024 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.322157 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:19Z\\\",\\\"message\\\":\\\"2025-12-05T15:55:34+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874\\\\n2025-12-05T15:55:34+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874 to /host/opt/cni/bin/\\\\n2025-12-05T15:55:34Z [verbose] multus-daemon started\\\\n2025-12-05T15:55:34Z [verbose] Readiness Indicator file check\\\\n2025-12-05T15:56:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.339238 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.389244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.389303 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.389322 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.389348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.389395 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:33Z","lastTransitionTime":"2025-12-05T15:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.394225 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.415411 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.434900 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.453898 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdf51cd-d9a1-4678-8a50-170bcd551c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46dfbfa7cc094d6fe37248fcff273d561a58dfce35ed231a303e4bd70cf0fbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8be3eb2a66916642073568cde0334019c613c661e21540feaeeaba10a686477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf9869dc28a932f93dc17e7251a9439cc0800654b657778a019904dea1065ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.486696 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0
7b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.493978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.494039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.494059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.494086 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.494105 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:33Z","lastTransitionTime":"2025-12-05T15:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.503016 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.532714 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:29Z\\\",\\\"message\\\":\\\".333395 6771 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:29.333396 6771 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:29.334516 6771 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 15:56:29.334606 6771 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 15:56:29.334623 6771 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 15:56:29.334632 6771 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 15:56:29.334645 6771 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 15:56:29.334645 6771 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 15:56:29.334654 6771 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 15:56:29.334689 6771 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 15:56:29.334699 6771 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 15:56:29.334730 6771 factory.go:656] Stopping watch factory\\\\nI1205 15:56:29.334753 6771 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:56:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.548174 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.568537 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.588556 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.597359 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.597538 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.597562 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.597595 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.597619 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:33Z","lastTransitionTime":"2025-12-05T15:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.609939 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.628225 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.645633 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 
15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.671569 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:33Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.699558 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.699607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.699620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.699637 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.699651 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:33Z","lastTransitionTime":"2025-12-05T15:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.803056 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.803142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.803169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.803203 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.803229 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:33Z","lastTransitionTime":"2025-12-05T15:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.906300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.906359 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.906422 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.906443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:33 crc kubenswrapper[4778]: I1205 15:56:33.906457 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:33Z","lastTransitionTime":"2025-12-05T15:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.008344 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.008446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.008465 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.008490 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.008508 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:34Z","lastTransitionTime":"2025-12-05T15:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.111226 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.111293 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.111312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.111339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.111356 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:34Z","lastTransitionTime":"2025-12-05T15:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.214325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.214424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.214443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.214467 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.214486 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:34Z","lastTransitionTime":"2025-12-05T15:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.249158 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:34 crc kubenswrapper[4778]: E1205 15:56:34.249354 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.317177 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.317495 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.317563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.317625 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.317702 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:34Z","lastTransitionTime":"2025-12-05T15:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.421642 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.421703 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.421721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.421745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.421763 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:34Z","lastTransitionTime":"2025-12-05T15:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.524353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.524445 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.524463 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.524485 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.524504 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:34Z","lastTransitionTime":"2025-12-05T15:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.628089 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.628130 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.628141 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.628159 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.628173 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:34Z","lastTransitionTime":"2025-12-05T15:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.731056 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.731126 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.731144 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.731172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.731191 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:34Z","lastTransitionTime":"2025-12-05T15:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.834483 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.834563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.834581 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.834610 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.834627 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:34Z","lastTransitionTime":"2025-12-05T15:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.937807 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.937872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.937889 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.937914 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:34 crc kubenswrapper[4778]: I1205 15:56:34.937932 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:34Z","lastTransitionTime":"2025-12-05T15:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.000050 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.000196 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.000213 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.000333 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:57:39.000302266 +0000 UTC m=+146.104098706 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.000421 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.000498 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 15:57:39.000473351 +0000 UTC m=+146.104269761 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.042070 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.042195 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.042211 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.042232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.042248 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:35Z","lastTransitionTime":"2025-12-05T15:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.100920 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.101045 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.101093 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.101210 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:39.101172794 +0000 UTC m=+146.204969214 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.101228 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.101276 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.101286 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.101296 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.101310 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.101311 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.101414 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 15:57:39.101358579 +0000 UTC m=+146.205154999 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.101443 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 15:57:39.101427141 +0000 UTC m=+146.205223611 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.145537 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.145601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.145618 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.145646 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.145665 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:35Z","lastTransitionTime":"2025-12-05T15:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.248508 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.248537 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.248707 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.248753 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.248776 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.248841 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.248860 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.248884 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.248902 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:35Z","lastTransitionTime":"2025-12-05T15:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.248916 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:35 crc kubenswrapper[4778]: E1205 15:56:35.249107 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.351968 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.352069 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.352110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.352270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.352307 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:35Z","lastTransitionTime":"2025-12-05T15:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.456033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.456111 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.456135 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.456173 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.456197 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:35Z","lastTransitionTime":"2025-12-05T15:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.559988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.560052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.560071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.560094 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.560111 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:35Z","lastTransitionTime":"2025-12-05T15:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.662666 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.662732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.662750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.662809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.662829 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:35Z","lastTransitionTime":"2025-12-05T15:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.766287 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.766350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.766402 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.766428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.766446 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:35Z","lastTransitionTime":"2025-12-05T15:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.870539 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.870633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.870657 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.870689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.870788 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:35Z","lastTransitionTime":"2025-12-05T15:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.973669 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.973724 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.973737 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.973757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:35 crc kubenswrapper[4778]: I1205 15:56:35.973770 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:35Z","lastTransitionTime":"2025-12-05T15:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.076715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.076783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.076802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.076830 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.076849 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.181042 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.181295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.181306 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.181320 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.181331 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.248584 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:36 crc kubenswrapper[4778]: E1205 15:56:36.248843 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.278120 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.278185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.278209 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.278243 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.278267 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: E1205 15:56:36.299768 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.304892 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.304979 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.304998 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.305025 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.305043 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: E1205 15:56:36.323107 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.327659 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.327711 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.327729 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.327755 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.327774 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: E1205 15:56:36.348202 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.356856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.356936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.356962 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.356990 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.357012 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: E1205 15:56:36.377393 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.382102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.382151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.382168 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.382193 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.382213 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: E1205 15:56:36.402562 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65e5e8dd-e6c2-442e-a88f-fb212aa7a9d7\\\",\\\"systemUUID\\\":\\\"d6159bdf-f1e9-405b-9393-3eae3aaf61c7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:36Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:36 crc kubenswrapper[4778]: E1205 15:56:36.402776 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.405038 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.405082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.405099 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.405120 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.405137 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.508462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.508613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.508639 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.508669 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.508688 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.612049 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.612109 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.612127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.612151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.612170 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.715473 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.715545 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.715568 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.715601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.715627 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.819193 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.819266 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.819287 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.819341 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.819400 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.923277 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.923348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.923434 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.923466 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:36 crc kubenswrapper[4778]: I1205 15:56:36.923493 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:36Z","lastTransitionTime":"2025-12-05T15:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.025960 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.026025 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.026053 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.026081 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.026101 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:37Z","lastTransitionTime":"2025-12-05T15:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.128677 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.128748 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.128768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.128791 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.128809 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:37Z","lastTransitionTime":"2025-12-05T15:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.231030 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.231082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.231098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.231121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.231139 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:37Z","lastTransitionTime":"2025-12-05T15:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.249533 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.249557 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.249578 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:37 crc kubenswrapper[4778]: E1205 15:56:37.249732 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:37 crc kubenswrapper[4778]: E1205 15:56:37.249928 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:37 crc kubenswrapper[4778]: E1205 15:56:37.250401 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.334915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.334963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.334984 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.335014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.335037 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:37Z","lastTransitionTime":"2025-12-05T15:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.437145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.437189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.437201 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.437219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.437234 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:37Z","lastTransitionTime":"2025-12-05T15:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.540442 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.540542 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.540559 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.540613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.540634 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:37Z","lastTransitionTime":"2025-12-05T15:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.644172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.644235 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.644258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.644290 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.644313 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:37Z","lastTransitionTime":"2025-12-05T15:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.747137 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.747197 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.747215 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.747243 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.747265 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:37Z","lastTransitionTime":"2025-12-05T15:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.849697 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.849775 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.849803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.849836 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.849864 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:37Z","lastTransitionTime":"2025-12-05T15:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.953190 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.953251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.953267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.953292 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:37 crc kubenswrapper[4778]: I1205 15:56:37.953315 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:37Z","lastTransitionTime":"2025-12-05T15:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.056450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.056499 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.056511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.056532 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.056547 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:38Z","lastTransitionTime":"2025-12-05T15:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.159281 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.159348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.159384 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.159406 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.159419 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:38Z","lastTransitionTime":"2025-12-05T15:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.248923 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:38 crc kubenswrapper[4778]: E1205 15:56:38.249157 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.262749 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.262816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.262840 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.262865 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.262878 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:38Z","lastTransitionTime":"2025-12-05T15:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.390832 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.390887 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.390900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.390920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.390936 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:38Z","lastTransitionTime":"2025-12-05T15:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.494556 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.494624 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.494640 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.494666 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.494685 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:38Z","lastTransitionTime":"2025-12-05T15:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.598260 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.598326 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.598345 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.598396 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.598414 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:38Z","lastTransitionTime":"2025-12-05T15:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.701611 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.701676 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.701695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.701718 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.701736 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:38Z","lastTransitionTime":"2025-12-05T15:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.805570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.805638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.805655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.805681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.805699 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:38Z","lastTransitionTime":"2025-12-05T15:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.908967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.909054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.909081 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.909113 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:38 crc kubenswrapper[4778]: I1205 15:56:38.909136 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:38Z","lastTransitionTime":"2025-12-05T15:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.013041 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.013138 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.013174 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.013209 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.013233 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:39Z","lastTransitionTime":"2025-12-05T15:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.116258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.116337 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.116444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.116491 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.116521 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:39Z","lastTransitionTime":"2025-12-05T15:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.219899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.219989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.220017 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.220056 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.220159 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:39Z","lastTransitionTime":"2025-12-05T15:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.250185 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:39 crc kubenswrapper[4778]: E1205 15:56:39.250551 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.250621 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.250660 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:39 crc kubenswrapper[4778]: E1205 15:56:39.250980 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:39 crc kubenswrapper[4778]: E1205 15:56:39.251177 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.322782 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.322852 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.322875 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.322905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.322929 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:39Z","lastTransitionTime":"2025-12-05T15:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.425620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.425685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.425709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.425737 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.425759 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:39Z","lastTransitionTime":"2025-12-05T15:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.528636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.528702 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.528724 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.528756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.528777 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:39Z","lastTransitionTime":"2025-12-05T15:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.632261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.632310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.632327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.632353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.632396 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:39Z","lastTransitionTime":"2025-12-05T15:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.735313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.735402 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.735428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.735456 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.735481 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:39Z","lastTransitionTime":"2025-12-05T15:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.838036 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.838078 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.838095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.838116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.838131 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:39Z","lastTransitionTime":"2025-12-05T15:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.941079 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.941155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.941172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.941200 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:39 crc kubenswrapper[4778]: I1205 15:56:39.941218 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:39Z","lastTransitionTime":"2025-12-05T15:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.043976 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.044029 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.044045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.044068 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.044084 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:40Z","lastTransitionTime":"2025-12-05T15:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.146605 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.146666 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.146689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.146720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.146742 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:40Z","lastTransitionTime":"2025-12-05T15:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.248631 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:40 crc kubenswrapper[4778]: E1205 15:56:40.248858 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.250775 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.250831 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.250849 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.250874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.250894 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:40Z","lastTransitionTime":"2025-12-05T15:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.353804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.353881 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.353904 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.353935 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.353965 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:40Z","lastTransitionTime":"2025-12-05T15:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.456938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.456996 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.457014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.457073 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.457093 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:40Z","lastTransitionTime":"2025-12-05T15:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.560471 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.560524 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.560541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.560567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.560586 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:40Z","lastTransitionTime":"2025-12-05T15:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.663973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.664038 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.664065 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.664112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.664136 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:40Z","lastTransitionTime":"2025-12-05T15:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.767114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.767186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.767211 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.767242 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.767270 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:40Z","lastTransitionTime":"2025-12-05T15:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.869327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.869381 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.869392 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.869417 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.869429 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:40Z","lastTransitionTime":"2025-12-05T15:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.972403 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.972465 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.972474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.972492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:40 crc kubenswrapper[4778]: I1205 15:56:40.972503 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:40Z","lastTransitionTime":"2025-12-05T15:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.075562 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.075605 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.075613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.075629 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.075638 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:41Z","lastTransitionTime":"2025-12-05T15:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.179974 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.180058 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.180081 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.180113 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.180137 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:41Z","lastTransitionTime":"2025-12-05T15:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.249451 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.249504 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:41 crc kubenswrapper[4778]: E1205 15:56:41.249704 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.249837 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:41 crc kubenswrapper[4778]: E1205 15:56:41.249911 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:41 crc kubenswrapper[4778]: E1205 15:56:41.249970 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.282922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.282969 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.282986 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.283006 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.283024 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:41Z","lastTransitionTime":"2025-12-05T15:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.386225 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.386274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.386291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.386317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.386338 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:41Z","lastTransitionTime":"2025-12-05T15:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.490481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.490544 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.490562 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.490589 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.490608 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:41Z","lastTransitionTime":"2025-12-05T15:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.593528 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.593825 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.593981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.594455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.594631 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:41Z","lastTransitionTime":"2025-12-05T15:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.698498 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.699000 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.699211 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.699430 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.699640 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:41Z","lastTransitionTime":"2025-12-05T15:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.803711 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.803804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.803829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.803860 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.803883 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:41Z","lastTransitionTime":"2025-12-05T15:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.906247 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.906622 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.906715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.906806 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:41 crc kubenswrapper[4778]: I1205 15:56:41.906916 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:41Z","lastTransitionTime":"2025-12-05T15:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.009249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.009285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.009294 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.009310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.009318 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:42Z","lastTransitionTime":"2025-12-05T15:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.112628 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.112681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.112703 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.112733 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.112756 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:42Z","lastTransitionTime":"2025-12-05T15:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.215778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.215837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.215858 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.215885 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.215904 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:42Z","lastTransitionTime":"2025-12-05T15:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.249460 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:42 crc kubenswrapper[4778]: E1205 15:56:42.249661 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.319270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.319335 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.319358 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.319429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.319454 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:42Z","lastTransitionTime":"2025-12-05T15:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.422503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.422585 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.422604 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.423050 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.423106 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:42Z","lastTransitionTime":"2025-12-05T15:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.525780 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.525837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.525856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.525885 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.525903 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:42Z","lastTransitionTime":"2025-12-05T15:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.628557 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.628631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.628647 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.628678 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.628695 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:42Z","lastTransitionTime":"2025-12-05T15:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.732133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.732196 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.732232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.732269 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.732293 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:42Z","lastTransitionTime":"2025-12-05T15:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.836023 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.836230 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.836253 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.836283 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.836305 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:42Z","lastTransitionTime":"2025-12-05T15:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.938826 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.938880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.938903 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.938933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:42 crc kubenswrapper[4778]: I1205 15:56:42.938956 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:42Z","lastTransitionTime":"2025-12-05T15:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.041001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.041072 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.041088 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.041115 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.041132 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:43Z","lastTransitionTime":"2025-12-05T15:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.144316 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.144439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.144465 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.144496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.144518 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:43Z","lastTransitionTime":"2025-12-05T15:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.247100 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.248180 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.248218 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.248243 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.248256 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:43Z","lastTransitionTime":"2025-12-05T15:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.248488 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:43 crc kubenswrapper[4778]: E1205 15:56:43.248629 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.248684 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.249069 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:43 crc kubenswrapper[4778]: E1205 15:56:43.249284 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.249360 4778 scope.go:117] "RemoveContainer" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751" Dec 05 15:56:43 crc kubenswrapper[4778]: E1205 15:56:43.249460 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:43 crc kubenswrapper[4778]: E1205 15:56:43.249826 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.269189 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65be1d8fabfbcc2642cc9940cb9e10f159760d628d68c6b118f98b33b46b9397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.284149 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4770dfb0-d6eb-436d-a657-3539f03c6e0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a242e487a4490b3202ebb43e076048747f313f2b87745967dc8b127048620556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ee70289f02e93ae1317dab5530de4608ec1861205e4dd790a0eab88cd45264d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9j5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p9nwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 
15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.306597 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fd281bf-fe41-407b-b13c-5392ccd67b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"ES_128_CBC_SHA256' detected.\\\\nW1205 15:55:31.169539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 15:55:31.169543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 15:55:31.169550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 15:55:31.169556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 15:55:31.169904 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1205 15:55:31.174884 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174947 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1205 15:55:31.174982 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175030 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1205 15:55:31.175043 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1205 15:55:31.175100 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1205 15:55:31.175416 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1205 15:55:31.175447 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1205 15:55:31.176212 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3216376870/tls.crt::/tmp/serving-cert-3216376870/tls.key\\\\\\\"\\\\nF1205 15:55:31.177355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.328790 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.350081 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.350963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.351170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.351189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.351214 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.351231 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:43Z","lastTransitionTime":"2025-12-05T15:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.375803 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:19Z\\\",\\\"message\\\":\\\"2025-12-05T15:55:34+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874\\\\n2025-12-05T15:55:34+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d4b75ba-15d1-4f23-b65e-1a7795fe4874 to /host/opt/cni/bin/\\\\n2025-12-05T15:55:34Z [verbose] multus-daemon started\\\\n2025-12-05T15:55:34Z [verbose] Readiness Indicator file check\\\\n2025-12-05T15:56:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg8xf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.401071 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c63eba1-fb5c-431f-beed-0e81832f7e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e1d9bb0b2737905872b0f739d617797ab932f905503f4c85751b96976f58c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e36d0c1c5319584100a311199165be41987d147dad520b5320e299105e29f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe2b65332ff18b237a55a9530d646eb2ea91d29fd6cede323cada875fdfcad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24699d3cad43653078bf159f84add2ab80667c6bf2e72e0b11a16b85e2579ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfb47fa523b0905da7817ee6a16c196db2ac5df5670d074ab9ebd3b0dad3b108\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099163bf8ccdfd676495caa6f9d0808611d9ebdd9471d92888101dc9a1e68ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d3ebe761390c1b2d702a53b5d747d6502f1ec4eb9cc4ec6671ab5359dd27988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5ggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gvdqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.419684 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48cc0dd1-7387-4df1-aa6a-198ac40c620d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkwnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tvxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.436131 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a8b3b8d4f960a599adbb3017b605bf3cc0ee9acb66794a60e1819c093aaa6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e70f67a757d4bc498d048cb76102e4a01749556294d6504b9c7dcec107191a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.454535 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d49f50cac37ccf48d8c185fef077495dad56d7bca1a7912c5f003dd85014dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6df77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jqrsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.454758 4778 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.454802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.454828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.454858 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.455065 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:43Z","lastTransitionTime":"2025-12-05T15:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.471636 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdf51cd-d9a1-4678-8a50-170bcd551c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46dfbfa7cc094d6fe37248fcff273d561a58dfce35ed231a303e4bd70cf0fbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8be3eb2a66916642073568cde0334019c613c661e21540feaeeaba10a686477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf9869dc28a932f93dc17e7251a9439cc0800654b657778a019904dea1065ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17a69e72c45acec77bc2cd6e69c259dd96fd225ceb2ca5d8d593d9e3ee6b120e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.491296 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1c979120a5d1a4cf3c77eae72e8f8687ccea38c5f783a0fe85e9746884cc63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.507480 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.525952 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xm5sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca6b2f2c-9429-4e93-b595-38d5ac9e0d57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e7da04e86d73919faf3c4d47c882490b6acdfd2401df7cbc7fe2f0ef869d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7j8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xm5sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.549128 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6837b168-c691-4e7e-a211-a0c8ef0534e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa4
1ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T15:56:29Z\\\",\\\"message\\\":\\\".333395 6771 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:29.333396 6771 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 15:56:29.334516 6771 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 15:56:29.334606 6771 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 15:56:29.334623 6771 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 15:56:29.334632 6771 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 15:56:29.334645 6771 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 15:56:29.334645 6771 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 15:56:29.334654 6771 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 15:56:29.334689 6771 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 15:56:29.334699 6771 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 15:56:29.334730 6771 factory.go:656] Stopping watch factory\\\\nI1205 15:56:29.334753 6771 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T15:56:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vzs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.559309 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.559408 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.559433 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.559466 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.559491 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:43Z","lastTransitionTime":"2025-12-05T15:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.562452 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w67tn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2998e77d-eac9-4670-8527-5cdba406e819\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22a72852474f0604e888285c50e8ce171bef208a6fe47d31f84ef0f460153eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t9p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w67tn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.580787 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34c59f2c-f7c7-4250-aa1a-fc66c12f82d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed84079611e8019b47eac42eb6ab0d6fec58223dc74eb07cc8ab3adb114ef5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9da68f66d143ecd7508ac0ac689cdd56b0e3d2b3dec8b1a590fef5deba58d61c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cd33d8e355cdbbfaa5d5f9ff118f34d7237f04888c89617a622025896563f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.611624 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97d68f98-3d73-4bcf-a5c6-6855cb84aaeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T15:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687f7e336ec718a19ea08d02c126dcfeb95f167967b744ee93c93d0224f53925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://961dbc28c47d9fdef28a647027d938f565d39bca4a99155e9a25c327d989b481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://524574bea6f4810738aea6346e448f9428bb201795e1abd1aae2148c421c3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3c7b10e17ef1d8c5dc44e10ed4941d15a294e895f87982b0e40aeb99fc129c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1316e02ebcc5f6d7ee8f03302d2bed3793a1848bcdc06906af6af15a3cc6827\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T15:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77c8f2d4c3e014622c701c4b8bf798f8c5fca04fd13004049068a695e496cb03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6980fe8f9531923bc3afbf526f05cf16e79d617f8840f342776c2da6c76153b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e9bf52e6f4f69d4681e474b26e04abb26c0c7cb55ffa8baffcabb160c8a8530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T15:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T15:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T15:55:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T15:56:43Z is after 2025-08-24T17:21:41Z" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.662059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.662094 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.662105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.662122 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.662134 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:43Z","lastTransitionTime":"2025-12-05T15:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.764033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.764092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.764109 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.764246 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.764316 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:43Z","lastTransitionTime":"2025-12-05T15:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.867147 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.867186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.867198 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.867215 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.867230 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:43Z","lastTransitionTime":"2025-12-05T15:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.970200 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.970252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.970268 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.970291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:43 crc kubenswrapper[4778]: I1205 15:56:43.970307 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:43Z","lastTransitionTime":"2025-12-05T15:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.072852 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.072905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.072922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.072947 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.072966 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:44Z","lastTransitionTime":"2025-12-05T15:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.175843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.175912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.175936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.175969 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.175992 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:44Z","lastTransitionTime":"2025-12-05T15:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.248971 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:44 crc kubenswrapper[4778]: E1205 15:56:44.249173 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.279494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.279572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.279594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.279616 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.279635 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:44Z","lastTransitionTime":"2025-12-05T15:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.383499 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.383574 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.383598 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.383623 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:44 crc kubenswrapper[4778]: I1205 15:56:44.383640 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:44Z","lastTransitionTime":"2025-12-05T15:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
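Annotation: every one of the "Failed to update status for pod" patches above is rejected for the same reason. The API server calls the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 when admitting these patches, and that endpoint's serving certificate expired on 2025-08-24T17:21:41Z, long before the node's current clock of 2025-12-05T15:56:43Z. A minimal Go sketch for inspecting such an endpoint's certificate window; only the address is taken from the log, the file name and program are illustrative, not OpenShift tooling, and it must run on the node since the listener is on loopback:

    // certcheck.go, illustrative sketch: report the validity window of the
    // serving certificate behind a TLS endpoint.
    package main

    import (
        "crypto/tls"
        "fmt"
        "os"
        "time"
    )

    func main() {
        addr := "127.0.0.1:9743" // from: Post "https://127.0.0.1:9743/pod?timeout=10s"
        conn, err := tls.Dial("tcp", addr, &tls.Config{
            InsecureSkipVerify: true, // read the cert even though it is expired
        })
        if err != nil {
            fmt.Fprintln(os.Stderr, "dial:", err)
            os.Exit(1)
        }
        defer conn.Close()
        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Printf("subject:   %s\nnotBefore: %s\nnotAfter:  %s\n",
            cert.Subject,
            cert.NotBefore.Format(time.RFC3339),
            cert.NotAfter.Format(time.RFC3339))
        if time.Now().After(cert.NotAfter) {
            fmt.Println("serving certificate is expired; matches the x509 errors in this log")
        }
    }

Until that certificate is renewed, every status patch will keep failing with the same x509 error.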
[... the same status sequence repeats at 15:56:44.279, 15:56:44.383, 15:56:44.486, 15:56:44.590, 15:56:44.693, 15:56:44.796, 15:56:44.900 and 15:56:45.003 ...]
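Annotation: the NotReady heartbeats repeating here are mechanical. The container runtime reports NetworkReady=false because no CNI config file exists in /etc/kubernetes/cni/net.d/, and the kubelet copies that into the node's Ready condition on every status sync. On this cluster that config is normally written by OVN-Kubernetes, the ovnkube-node pod whose ovnkube-controller container is crash-looping earlier in this log, so the directory stays empty. An illustrative re-creation of the check; the path is from the log message, the file name is mine, and the extension list is what libcni loads:

    // cnicheck.go, illustrative only: reproduce the condition behind the
    // "no CNI configuration file in /etc/kubernetes/cni/net.d/" message above.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // path taken from the log message
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Fprintln(os.Stderr, "read dir:", err)
            os.Exit(1)
        }
        var confs []string
        for _, e := range entries {
            // libcni treats files with these extensions as network configs
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                confs = append(confs, e.Name())
            }
        }
        if len(confs) == 0 {
            fmt.Println("no CNI configuration file found: the node stays NotReady")
            return
        }
        fmt.Println("CNI configs present:", confs)
    }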
[... the same status sequence repeats at 15:56:45.107 and 15:56:45.212 ...]
Dec 05 15:56:45 crc kubenswrapper[4778]: I1205 15:56:45.249162 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:56:45 crc kubenswrapper[4778]: I1205 15:56:45.249686 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:56:45 crc kubenswrapper[4778]: I1205 15:56:45.249734 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:56:45 crc kubenswrapper[4778]: E1205 15:56:45.250067 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:56:45 crc kubenswrapper[4778]: E1205 15:56:45.250913 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:56:45 crc kubenswrapper[4778]: E1205 15:56:45.251095 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:56:45 crc kubenswrapper[4778]: I1205 15:56:45.268504 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
[... the same status sequence repeats at 15:56:45.316, 15:56:45.420, 15:56:45.524, 15:56:45.628, 15:56:45.731, 15:56:45.833, 15:56:45.937, 15:56:46.040 and 15:56:46.143 ...]
Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.246308 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.246397 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.246427 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.246458 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.246480 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:46Z","lastTransitionTime":"2025-12-05T15:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.249083 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:46 crc kubenswrapper[4778]: E1205 15:56:46.249446 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.349899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.349973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.349992 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.350018 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.350037 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:46Z","lastTransitionTime":"2025-12-05T15:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.444185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.444399 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.444432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.444460 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.444479 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T15:56:46Z","lastTransitionTime":"2025-12-05T15:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.508610 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999"] Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.509208 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.511762 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.512183 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.512535 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.512957 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.554857 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.554828948 podStartE2EDuration="1.554828948s" podCreationTimestamp="2025-12-05 15:56:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:56:46.554199721 +0000 UTC m=+93.657996141" watchObservedRunningTime="2025-12-05 15:56:46.554828948 +0000 UTC m=+93.658625368" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.555068 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.555059074 podStartE2EDuration="1m15.555059074s" podCreationTimestamp="2025-12-05 15:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:56:46.540051464 +0000 UTC m=+93.643847884" watchObservedRunningTime="2025-12-05 15:56:46.555059074 +0000 UTC m=+93.658855494" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.581282 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/43f03600-5968-4a47-b068-767dcf4ee6f3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.581520 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/43f03600-5968-4a47-b068-767dcf4ee6f3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.581574 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43f03600-5968-4a47-b068-767dcf4ee6f3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.581598 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f03600-5968-4a47-b068-767dcf4ee6f3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.581617 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43f03600-5968-4a47-b068-767dcf4ee6f3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.589120 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=72.589101506 podStartE2EDuration="1m12.589101506s" podCreationTimestamp="2025-12-05 15:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:56:46.586027761 +0000 UTC m=+93.689824141" watchObservedRunningTime="2025-12-05 15:56:46.589101506 +0000 UTC m=+93.692897886" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.620123 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xm5sq" podStartSLOduration=74.620102963 podStartE2EDuration="1m14.620102963s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:56:46.600448466 +0000 UTC m=+93.704244856" watchObservedRunningTime="2025-12-05 15:56:46.620102963 +0000 UTC m=+93.723899343" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.631251 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-w67tn" podStartSLOduration=74.631225397 podStartE2EDuration="1m14.631225397s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:56:46.630393364 +0000 UTC m=+93.734189754" watchObservedRunningTime="2025-12-05 15:56:46.631225397 +0000 UTC m=+93.735021817" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.650936 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=75.650913235 podStartE2EDuration="1m15.650913235s" podCreationTimestamp="2025-12-05 15:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:56:46.650647488 +0000 UTC m=+93.754443878" watchObservedRunningTime="2025-12-05 15:56:46.650913235 +0000 UTC m=+93.754709635" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.682893 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/43f03600-5968-4a47-b068-767dcf4ee6f3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.682957 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/43f03600-5968-4a47-b068-767dcf4ee6f3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.683011 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43f03600-5968-4a47-b068-767dcf4ee6f3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.683038 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/43f03600-5968-4a47-b068-767dcf4ee6f3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.683045 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f03600-5968-4a47-b068-767dcf4ee6f3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.683088 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43f03600-5968-4a47-b068-767dcf4ee6f3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.683015 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/43f03600-5968-4a47-b068-767dcf4ee6f3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.683894 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43f03600-5968-4a47-b068-767dcf4ee6f3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.695820 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f03600-5968-4a47-b068-767dcf4ee6f3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.705311 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43f03600-5968-4a47-b068-767dcf4ee6f3-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-gx999\" (UID: \"43f03600-5968-4a47-b068-767dcf4ee6f3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.710184 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p9nwv" podStartSLOduration=73.710153614 podStartE2EDuration="1m13.710153614s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:56:46.709411774 +0000 UTC m=+93.813208204" watchObservedRunningTime="2025-12-05 15:56:46.710153614 +0000 UTC m=+93.813950024" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.727466 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nrqmz" podStartSLOduration=74.727438897 podStartE2EDuration="1m14.727438897s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:56:46.726235844 +0000 UTC m=+93.830032224" watchObservedRunningTime="2025-12-05 15:56:46.727438897 +0000 UTC m=+93.831235307" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.743241 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gvdqh" podStartSLOduration=74.743220399 podStartE2EDuration="1m14.743220399s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:56:46.743018513 +0000 UTC m=+93.846814913" watchObservedRunningTime="2025-12-05 15:56:46.743220399 +0000 UTC m=+93.847016779" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.771462 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.77143222 podStartE2EDuration="39.77143222s" podCreationTimestamp="2025-12-05 15:56:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:56:46.770464183 +0000 UTC m=+93.874260553" watchObservedRunningTime="2025-12-05 15:56:46.77143222 +0000 UTC m=+93.875228640" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.826814 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.836942 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podStartSLOduration=74.836920619 podStartE2EDuration="1m14.836920619s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:56:46.836266492 +0000 UTC m=+93.940062872" watchObservedRunningTime="2025-12-05 15:56:46.836920619 +0000 UTC m=+93.940717009" Dec 05 15:56:46 crc kubenswrapper[4778]: W1205 15:56:46.842065 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43f03600_5968_4a47_b068_767dcf4ee6f3.slice/crio-8f501c1a8d256b899bdfa268617fd17f5d32e43b4c8a9b25afceabf0a7f4279d WatchSource:0}: Error finding container 8f501c1a8d256b899bdfa268617fd17f5d32e43b4c8a9b25afceabf0a7f4279d: Status 404 returned error can't find the container with id 8f501c1a8d256b899bdfa268617fd17f5d32e43b4c8a9b25afceabf0a7f4279d Dec 05 15:56:46 crc kubenswrapper[4778]: I1205 15:56:46.861110 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" event={"ID":"43f03600-5968-4a47-b068-767dcf4ee6f3","Type":"ContainerStarted","Data":"8f501c1a8d256b899bdfa268617fd17f5d32e43b4c8a9b25afceabf0a7f4279d"} Dec 05 15:56:47 crc kubenswrapper[4778]: I1205 15:56:47.249583 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:47 crc kubenswrapper[4778]: I1205 15:56:47.249690 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:47 crc kubenswrapper[4778]: E1205 15:56:47.249825 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:47 crc kubenswrapper[4778]: E1205 15:56:47.249908 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:47 crc kubenswrapper[4778]: I1205 15:56:47.250359 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:47 crc kubenswrapper[4778]: E1205 15:56:47.250657 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:47 crc kubenswrapper[4778]: I1205 15:56:47.865395 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" event={"ID":"43f03600-5968-4a47-b068-767dcf4ee6f3","Type":"ContainerStarted","Data":"43db83dd50953320d4fcc8a05f1719609a1d4826a4f353a9b54109084f56d9af"} Dec 05 15:56:47 crc kubenswrapper[4778]: I1205 15:56:47.886319 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gx999" podStartSLOduration=75.886287894 podStartE2EDuration="1m15.886287894s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:56:47.884765952 +0000 UTC m=+94.988562392" watchObservedRunningTime="2025-12-05 15:56:47.886287894 +0000 UTC m=+94.990084304" Dec 05 15:56:48 crc kubenswrapper[4778]: I1205 15:56:48.249083 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:48 crc kubenswrapper[4778]: E1205 15:56:48.249219 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:49 crc kubenswrapper[4778]: I1205 15:56:49.248599 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:49 crc kubenswrapper[4778]: I1205 15:56:49.248654 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:49 crc kubenswrapper[4778]: E1205 15:56:49.248820 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:49 crc kubenswrapper[4778]: E1205 15:56:49.248952 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:49 crc kubenswrapper[4778]: I1205 15:56:49.249539 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:49 crc kubenswrapper[4778]: E1205 15:56:49.249746 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:50 crc kubenswrapper[4778]: I1205 15:56:50.249092 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:50 crc kubenswrapper[4778]: E1205 15:56:50.249681 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:51 crc kubenswrapper[4778]: I1205 15:56:51.249476 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:51 crc kubenswrapper[4778]: I1205 15:56:51.249535 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:51 crc kubenswrapper[4778]: I1205 15:56:51.249602 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:51 crc kubenswrapper[4778]: E1205 15:56:51.249701 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:51 crc kubenswrapper[4778]: E1205 15:56:51.249826 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:51 crc kubenswrapper[4778]: E1205 15:56:51.249927 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:51 crc kubenswrapper[4778]: I1205 15:56:51.333534 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:51 crc kubenswrapper[4778]: E1205 15:56:51.333844 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:56:51 crc kubenswrapper[4778]: E1205 15:56:51.333970 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs podName:48cc0dd1-7387-4df1-aa6a-198ac40c620d nodeName:}" failed. No retries permitted until 2025-12-05 15:57:55.333940175 +0000 UTC m=+162.437736585 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs") pod "network-metrics-daemon-8tvxd" (UID: "48cc0dd1-7387-4df1-aa6a-198ac40c620d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 15:56:52 crc kubenswrapper[4778]: I1205 15:56:52.248968 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:52 crc kubenswrapper[4778]: E1205 15:56:52.249130 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:53 crc kubenswrapper[4778]: I1205 15:56:53.249659 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:53 crc kubenswrapper[4778]: I1205 15:56:53.249705 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:53 crc kubenswrapper[4778]: E1205 15:56:53.251884 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:53 crc kubenswrapper[4778]: I1205 15:56:53.251993 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:53 crc kubenswrapper[4778]: E1205 15:56:53.252326 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:53 crc kubenswrapper[4778]: E1205 15:56:53.252259 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:54 crc kubenswrapper[4778]: I1205 15:56:54.248822 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:54 crc kubenswrapper[4778]: E1205 15:56:54.249011 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:55 crc kubenswrapper[4778]: I1205 15:56:55.249282 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:55 crc kubenswrapper[4778]: I1205 15:56:55.249319 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:55 crc kubenswrapper[4778]: I1205 15:56:55.249443 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:55 crc kubenswrapper[4778]: E1205 15:56:55.249647 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:55 crc kubenswrapper[4778]: E1205 15:56:55.249835 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:55 crc kubenswrapper[4778]: E1205 15:56:55.249944 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:56 crc kubenswrapper[4778]: I1205 15:56:56.249728 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:56 crc kubenswrapper[4778]: E1205 15:56:56.250376 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:56 crc kubenswrapper[4778]: I1205 15:56:56.251076 4778 scope.go:117] "RemoveContainer" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751" Dec 05 15:56:56 crc kubenswrapper[4778]: E1205 15:56:56.251532 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" Dec 05 15:56:57 crc kubenswrapper[4778]: I1205 15:56:57.249094 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:57 crc kubenswrapper[4778]: I1205 15:56:57.249110 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:57 crc kubenswrapper[4778]: I1205 15:56:57.249110 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:57 crc kubenswrapper[4778]: E1205 15:56:57.249293 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:57 crc kubenswrapper[4778]: E1205 15:56:57.249449 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:56:57 crc kubenswrapper[4778]: E1205 15:56:57.249670 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:58 crc kubenswrapper[4778]: I1205 15:56:58.249174 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:56:58 crc kubenswrapper[4778]: E1205 15:56:58.249531 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:56:59 crc kubenswrapper[4778]: I1205 15:56:59.249494 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:56:59 crc kubenswrapper[4778]: I1205 15:56:59.249494 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:56:59 crc kubenswrapper[4778]: I1205 15:56:59.249825 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:56:59 crc kubenswrapper[4778]: E1205 15:56:59.250013 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:56:59 crc kubenswrapper[4778]: E1205 15:56:59.250285 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:56:59 crc kubenswrapper[4778]: E1205 15:56:59.250503 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:57:00 crc kubenswrapper[4778]: I1205 15:57:00.248789 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:57:00 crc kubenswrapper[4778]: E1205 15:57:00.248939 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:57:01 crc kubenswrapper[4778]: I1205 15:57:01.249458 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:57:01 crc kubenswrapper[4778]: I1205 15:57:01.249480 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:57:01 crc kubenswrapper[4778]: E1205 15:57:01.249713 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:57:01 crc kubenswrapper[4778]: E1205 15:57:01.249816 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:57:01 crc kubenswrapper[4778]: I1205 15:57:01.249484 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:57:01 crc kubenswrapper[4778]: E1205 15:57:01.249959 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:57:02 crc kubenswrapper[4778]: I1205 15:57:02.248566 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:57:02 crc kubenswrapper[4778]: E1205 15:57:02.248754 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:57:03 crc kubenswrapper[4778]: I1205 15:57:03.249551 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:57:03 crc kubenswrapper[4778]: I1205 15:57:03.249577 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:57:03 crc kubenswrapper[4778]: I1205 15:57:03.249644 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:57:03 crc kubenswrapper[4778]: E1205 15:57:03.251525 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:57:03 crc kubenswrapper[4778]: E1205 15:57:03.251823 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:57:03 crc kubenswrapper[4778]: E1205 15:57:03.251965 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:57:04 crc kubenswrapper[4778]: I1205 15:57:04.249327 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:57:04 crc kubenswrapper[4778]: E1205 15:57:04.249589 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:57:05 crc kubenswrapper[4778]: I1205 15:57:05.248746 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:57:05 crc kubenswrapper[4778]: E1205 15:57:05.249258 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 15:57:05 crc kubenswrapper[4778]: I1205 15:57:05.248885 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:57:05 crc kubenswrapper[4778]: I1205 15:57:05.248768 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:57:05 crc kubenswrapper[4778]: E1205 15:57:05.250015 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 15:57:05 crc kubenswrapper[4778]: E1205 15:57:05.250951 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d" Dec 05 15:57:05 crc kubenswrapper[4778]: I1205 15:57:05.937487 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrqmz_9b26d99a-f08e-41d1-b35c-5da99cbe3fb4/kube-multus/1.log" Dec 05 15:57:05 crc kubenswrapper[4778]: I1205 15:57:05.938170 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrqmz_9b26d99a-f08e-41d1-b35c-5da99cbe3fb4/kube-multus/0.log" Dec 05 15:57:05 crc kubenswrapper[4778]: I1205 15:57:05.938250 4778 generic.go:334] "Generic (PLEG): container finished" podID="9b26d99a-f08e-41d1-b35c-5da99cbe3fb4" containerID="14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3" exitCode=1 Dec 05 15:57:05 crc kubenswrapper[4778]: I1205 15:57:05.938305 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrqmz" event={"ID":"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4","Type":"ContainerDied","Data":"14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3"} Dec 05 15:57:05 crc kubenswrapper[4778]: I1205 15:57:05.938355 4778 scope.go:117] "RemoveContainer" containerID="c03c30745d6d90feabd702887bfb6d43d8fc5db0eca4ce2cf403029c673de26b" Dec 05 15:57:05 crc kubenswrapper[4778]: I1205 15:57:05.939070 4778 scope.go:117] "RemoveContainer" containerID="14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3" Dec 05 15:57:05 crc kubenswrapper[4778]: E1205 15:57:05.939471 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-nrqmz_openshift-multus(9b26d99a-f08e-41d1-b35c-5da99cbe3fb4)\"" pod="openshift-multus/multus-nrqmz" podUID="9b26d99a-f08e-41d1-b35c-5da99cbe3fb4" Dec 05 15:57:06 crc kubenswrapper[4778]: I1205 15:57:06.248774 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:57:06 crc kubenswrapper[4778]: E1205 15:57:06.249743 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 15:57:06 crc kubenswrapper[4778]: I1205 15:57:06.943596 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrqmz_9b26d99a-f08e-41d1-b35c-5da99cbe3fb4/kube-multus/1.log" Dec 05 15:57:07 crc kubenswrapper[4778]: I1205 15:57:07.248712 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:57:07 crc kubenswrapper[4778]: I1205 15:57:07.248739 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:57:07 crc kubenswrapper[4778]: E1205 15:57:07.248985 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:57:07 crc kubenswrapper[4778]: I1205 15:57:07.249111 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:07 crc kubenswrapper[4778]: E1205 15:57:07.249244 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:57:07 crc kubenswrapper[4778]: E1205 15:57:07.249426 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:57:08 crc kubenswrapper[4778]: I1205 15:57:08.248544 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:57:08 crc kubenswrapper[4778]: E1205 15:57:08.248732 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:57:09 crc kubenswrapper[4778]: I1205 15:57:09.248549 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:57:09 crc kubenswrapper[4778]: I1205 15:57:09.248594 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:09 crc kubenswrapper[4778]: E1205 15:57:09.248842 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:57:09 crc kubenswrapper[4778]: I1205 15:57:09.248935 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:57:09 crc kubenswrapper[4778]: E1205 15:57:09.249062 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:57:09 crc kubenswrapper[4778]: E1205 15:57:09.249852 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:57:09 crc kubenswrapper[4778]: I1205 15:57:09.250435 4778 scope.go:117] "RemoveContainer" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751"
Dec 05 15:57:09 crc kubenswrapper[4778]: E1205 15:57:09.250752 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vzs5q_openshift-ovn-kubernetes(6837b168-c691-4e7e-a211-a0c8ef0534e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2"
Dec 05 15:57:10 crc kubenswrapper[4778]: I1205 15:57:10.248502 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:57:10 crc kubenswrapper[4778]: E1205 15:57:10.248734 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:57:11 crc kubenswrapper[4778]: I1205 15:57:11.248668 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:57:11 crc kubenswrapper[4778]: I1205 15:57:11.248753 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:11 crc kubenswrapper[4778]: I1205 15:57:11.248810 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:57:11 crc kubenswrapper[4778]: E1205 15:57:11.248967 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:57:11 crc kubenswrapper[4778]: E1205 15:57:11.249302 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:57:11 crc kubenswrapper[4778]: E1205 15:57:11.249508 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:57:12 crc kubenswrapper[4778]: I1205 15:57:12.249188 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:57:12 crc kubenswrapper[4778]: E1205 15:57:12.249439 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:57:13 crc kubenswrapper[4778]: E1205 15:57:13.207724 4778 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Dec 05 15:57:13 crc kubenswrapper[4778]: I1205 15:57:13.249225 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:57:13 crc kubenswrapper[4778]: I1205 15:57:13.249831 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:13 crc kubenswrapper[4778]: E1205 15:57:13.251488 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:57:13 crc kubenswrapper[4778]: I1205 15:57:13.251524 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:57:13 crc kubenswrapper[4778]: E1205 15:57:13.251686 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:57:13 crc kubenswrapper[4778]: E1205 15:57:13.251833 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:57:13 crc kubenswrapper[4778]: E1205 15:57:13.327589 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 15:57:14 crc kubenswrapper[4778]: I1205 15:57:14.248948 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:57:14 crc kubenswrapper[4778]: E1205 15:57:14.249139 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:57:15 crc kubenswrapper[4778]: I1205 15:57:15.249088 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:57:15 crc kubenswrapper[4778]: I1205 15:57:15.249132 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:15 crc kubenswrapper[4778]: E1205 15:57:15.249312 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:57:15 crc kubenswrapper[4778]: E1205 15:57:15.249554 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:57:15 crc kubenswrapper[4778]: I1205 15:57:15.250026 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:57:15 crc kubenswrapper[4778]: E1205 15:57:15.250193 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:57:16 crc kubenswrapper[4778]: I1205 15:57:16.249567 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:57:16 crc kubenswrapper[4778]: E1205 15:57:16.249800 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:57:17 crc kubenswrapper[4778]: I1205 15:57:17.249266 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:57:17 crc kubenswrapper[4778]: I1205 15:57:17.249421 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:17 crc kubenswrapper[4778]: I1205 15:57:17.249281 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:57:17 crc kubenswrapper[4778]: E1205 15:57:17.249595 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:57:17 crc kubenswrapper[4778]: E1205 15:57:17.249784 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:57:17 crc kubenswrapper[4778]: E1205 15:57:17.249905 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:57:18 crc kubenswrapper[4778]: I1205 15:57:18.249077 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:57:18 crc kubenswrapper[4778]: E1205 15:57:18.249239 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:57:18 crc kubenswrapper[4778]: E1205 15:57:18.328722 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 15:57:19 crc kubenswrapper[4778]: I1205 15:57:19.248861 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:57:19 crc kubenswrapper[4778]: E1205 15:57:19.248967 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:57:19 crc kubenswrapper[4778]: I1205 15:57:19.249122 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:19 crc kubenswrapper[4778]: E1205 15:57:19.249171 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:57:19 crc kubenswrapper[4778]: I1205 15:57:19.249359 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:57:19 crc kubenswrapper[4778]: E1205 15:57:19.249457 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:57:20 crc kubenswrapper[4778]: I1205 15:57:20.248958 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:57:20 crc kubenswrapper[4778]: E1205 15:57:20.249198 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:57:21 crc kubenswrapper[4778]: I1205 15:57:21.248745 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:57:21 crc kubenswrapper[4778]: I1205 15:57:21.248777 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:57:21 crc kubenswrapper[4778]: E1205 15:57:21.249056 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:57:21 crc kubenswrapper[4778]: I1205 15:57:21.249141 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:21 crc kubenswrapper[4778]: I1205 15:57:21.249275 4778 scope.go:117] "RemoveContainer" containerID="14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3"
Dec 05 15:57:21 crc kubenswrapper[4778]: E1205 15:57:21.249614 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:57:21 crc kubenswrapper[4778]: E1205 15:57:21.249773 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:57:22 crc kubenswrapper[4778]: I1205 15:57:22.001952 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrqmz_9b26d99a-f08e-41d1-b35c-5da99cbe3fb4/kube-multus/1.log"
Dec 05 15:57:22 crc kubenswrapper[4778]: I1205 15:57:22.002298 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrqmz" event={"ID":"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4","Type":"ContainerStarted","Data":"755a407d01c202dcc90aac7a00034bea66f43c3d38847030c371b6abe04a171b"}
Dec 05 15:57:22 crc kubenswrapper[4778]: I1205 15:57:22.248752 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:57:22 crc kubenswrapper[4778]: E1205 15:57:22.248955 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:57:23 crc kubenswrapper[4778]: I1205 15:57:23.249743 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:57:23 crc kubenswrapper[4778]: I1205 15:57:23.249902 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:57:23 crc kubenswrapper[4778]: E1205 15:57:23.252132 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:57:23 crc kubenswrapper[4778]: I1205 15:57:23.252184 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:23 crc kubenswrapper[4778]: E1205 15:57:23.252891 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:57:23 crc kubenswrapper[4778]: E1205 15:57:23.253064 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:57:23 crc kubenswrapper[4778]: I1205 15:57:23.253470 4778 scope.go:117] "RemoveContainer" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751"
Dec 05 15:57:23 crc kubenswrapper[4778]: E1205 15:57:23.330675 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 15:57:24 crc kubenswrapper[4778]: I1205 15:57:24.011650 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/3.log"
Dec 05 15:57:24 crc kubenswrapper[4778]: I1205 15:57:24.015851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerStarted","Data":"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5"}
Dec 05 15:57:24 crc kubenswrapper[4778]: I1205 15:57:24.016481 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q"
Dec 05 15:57:24 crc kubenswrapper[4778]: I1205 15:57:24.042153 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podStartSLOduration=112.042131563 podStartE2EDuration="1m52.042131563s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:24.041205178 +0000 UTC m=+131.145001558" watchObservedRunningTime="2025-12-05 15:57:24.042131563 +0000 UTC m=+131.145927943"
Dec 05 15:57:24 crc kubenswrapper[4778]: I1205 15:57:24.207727 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8tvxd"]
Dec 05 15:57:24 crc kubenswrapper[4778]: I1205 15:57:24.207863 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:24 crc kubenswrapper[4778]: E1205 15:57:24.207973 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:57:24 crc kubenswrapper[4778]: I1205 15:57:24.250291 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:57:24 crc kubenswrapper[4778]: E1205 15:57:24.250484 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:57:25 crc kubenswrapper[4778]: I1205 15:57:25.249749 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:25 crc kubenswrapper[4778]: I1205 15:57:25.249856 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:57:25 crc kubenswrapper[4778]: E1205 15:57:25.250442 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:57:25 crc kubenswrapper[4778]: I1205 15:57:25.249857 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:57:25 crc kubenswrapper[4778]: E1205 15:57:25.250859 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:57:25 crc kubenswrapper[4778]: E1205 15:57:25.251016 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:57:26 crc kubenswrapper[4778]: I1205 15:57:26.249441 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:57:26 crc kubenswrapper[4778]: E1205 15:57:26.249974 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:57:27 crc kubenswrapper[4778]: I1205 15:57:27.249336 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:57:27 crc kubenswrapper[4778]: I1205 15:57:27.249469 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:27 crc kubenswrapper[4778]: E1205 15:57:27.249578 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 15:57:27 crc kubenswrapper[4778]: E1205 15:57:27.249773 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tvxd" podUID="48cc0dd1-7387-4df1-aa6a-198ac40c620d"
Dec 05 15:57:27 crc kubenswrapper[4778]: I1205 15:57:27.249959 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:57:27 crc kubenswrapper[4778]: E1205 15:57:27.250058 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 15:57:28 crc kubenswrapper[4778]: I1205 15:57:28.249526 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:57:28 crc kubenswrapper[4778]: E1205 15:57:28.249731 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 15:57:29 crc kubenswrapper[4778]: I1205 15:57:29.249547 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 15:57:29 crc kubenswrapper[4778]: I1205 15:57:29.249585 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 15:57:29 crc kubenswrapper[4778]: I1205 15:57:29.250665 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd"
Dec 05 15:57:29 crc kubenswrapper[4778]: I1205 15:57:29.254135 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 05 15:57:29 crc kubenswrapper[4778]: I1205 15:57:29.254329 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 05 15:57:29 crc kubenswrapper[4778]: I1205 15:57:29.254517 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 05 15:57:29 crc kubenswrapper[4778]: I1205 15:57:29.254569 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 05 15:57:29 crc kubenswrapper[4778]: I1205 15:57:29.254976 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 05 15:57:29 crc kubenswrapper[4778]: I1205 15:57:29.255068 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 05 15:57:30 crc kubenswrapper[4778]: I1205 15:57:30.248776 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 15:57:33 crc kubenswrapper[4778]: I1205 15:57:33.415237 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 15:57:33 crc kubenswrapper[4778]: I1205 15:57:33.415326 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.533414 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.579820 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.581040 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-smxt5"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.581200 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.581873 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mw82h"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.582127 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-smxt5"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.583202 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mw82h"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.586480 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.586555 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.587709 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.587764 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.588165 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.588438 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.588728 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.589563 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.589934 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.589953 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.590502 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.607479 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.607607 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.607812 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.607952 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.608173 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.608235 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.608463 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.612168 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.613280 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.616002 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.617287 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-27smd"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.618116 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.618436 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4sh2x"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.619171 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.619269 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffqnf"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.619807 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ffqnf"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.621124 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.621573 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.625459 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.626247 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-gnnls"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.626739 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-vf689"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.627420 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.627942 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.628280 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gnnls"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.628866 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.629579 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.634462 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kxmn5"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.635137 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ln57b"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.635663 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ln57b"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.635732 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.646630 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.646865 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.649424 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.649916 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.650215 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.650636 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.650943 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.651211 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.651464 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.651532 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.651629 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.651980 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.652313 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.652494 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.652643 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.652706 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.654794 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.654993 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.655028 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.655087 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.655181 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.655242 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.655427 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.655720 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.655786 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.655974 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.655999 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.656008 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.656168 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.656296 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.656444 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.656574 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.656730 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.656860 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.657182 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.657254 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.657610 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.657794 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.657938 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.658863 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.658966 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.659075 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.659172 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.659402 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.666194 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.670916 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.675597 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.678803 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.679179 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.679393 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.679685 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.682929 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.686002 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.689937 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.690340 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.676292 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-zzpst"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.691689 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.692067 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zzpst"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.692129 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jfv5q"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.692638 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.693458 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ptlw8"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.694152 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.694457 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.694909 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jfv5q"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.695511 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.741001 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.742767 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ndj72"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.743476 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.743998 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ndj72"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744188 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744011 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744046 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q"]
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744068 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/782cf494-3079-47fe-8f6c-f7d5731a5b69-config\") pod \"route-controller-manager-6576b87f9c-6jmzh\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744475 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rbqx\" (UniqueName: \"kubernetes.io/projected/782cf494-3079-47fe-8f6c-f7d5731a5b69-kube-api-access-5rbqx\") pod \"route-controller-manager-6576b87f9c-6jmzh\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744502 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbrlh\" (UniqueName: \"kubernetes.io/projected/21e078d9-a539-4626-b30f-908b8e866a7a-kube-api-access-gbrlh\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744531 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744551 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfbfb374-e786-4e49-8c80-54ec12c7abcc-serving-cert\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744571 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744601 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f36385-887c-4459-9c97-b1fb8f8d1d26-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-psmvp\" (UID: \"26f36385-887c-4459-9c97-b1fb8f8d1d26\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744637 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744729 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744793 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-service-ca\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744822 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwxv5\" (UniqueName: \"kubernetes.io/projected/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-kube-api-access-qwxv5\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744859 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-image-import-ca\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744887 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ea62f2-f338-4164-8ecc-3d7d777c0d43-config\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745020 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/782cf494-3079-47fe-8f6c-f7d5731a5b69-client-ca\") pod \"route-controller-manager-6576b87f9c-6jmzh\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745090 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-audit-policies\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745156 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/782cf494-3079-47fe-8f6c-f7d5731a5b69-serving-cert\") pod \"route-controller-manager-6576b87f9c-6jmzh\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745178 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f859bdf8-f651-407a-a6b8-6c3ae2fe7f63-config\") pod \"machine-api-operator-5694c8668f-27smd\" (UID: \"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745193 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c410dc-98d6-4319-9bae-e4025e9fdbb5-config\") pod \"kube-apiserver-operator-766d6c64bb-znzwx\" (UID: \"21c410dc-98d6-4319-9bae-e4025e9fdbb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745212 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10a607f0-e1ef-405d-9771-54076793d426-serving-cert\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745227 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg8g8\" (UniqueName: \"kubernetes.io/projected/11591472-458c-43dc-b51b-2b15987291a0-kube-api-access-kg8g8\") pod \"dns-operator-744455d44c-mw82h\" (UID: \"11591472-458c-43dc-b51b-2b15987291a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-mw82h"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745243 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21e078d9-a539-4626-b30f-908b8e866a7a-audit-dir\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745259 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-config\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745274 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-encryption-config\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745289 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-config\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745311 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50ea62f2-f338-4164-8ecc-3d7d777c0d43-serving-cert\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745327 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745342 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cfbfb374-e786-4e49-8c80-54ec12c7abcc-node-pullsecrets\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745356 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffn5l\" (UID: \"9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745227 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745431 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745230 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745560 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2c52\" (UniqueName: \"kubernetes.io/projected/cfbfb374-e786-4e49-8c80-54ec12c7abcc-kube-api-access-c2c52\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745579 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745587 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfl8h\" (UniqueName: \"kubernetes.io/projected/b8d6b1ed-75bd-4c5a-ab2d-294b7359301d-kube-api-access-mfl8h\") pod \"openshift-config-operator-7777fb866f-5bl2q\" (UID: \"b8d6b1ed-75bd-4c5a-ab2d-294b7359301d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q"
Dec 05 15:57:37 crc
kubenswrapper[4778]: I1205 15:57:37.745611 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-serving-cert\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745302 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745641 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f859bdf8-f651-407a-a6b8-6c3ae2fe7f63-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-27smd\" (UID: \"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745662 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745377 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745700 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745737 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745740 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745752 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745771 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-trusted-ca-bundle\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745774 4778 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745860 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffn5l\" (UID: \"9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745893 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5snd\" (UniqueName: \"kubernetes.io/projected/9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61-kube-api-access-m5snd\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffn5l\" (UID: \"9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.745979 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b8d6b1ed-75bd-4c5a-ab2d-294b7359301d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5bl2q\" (UID: \"b8d6b1ed-75bd-4c5a-ab2d-294b7359301d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746029 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41c5c97b-d54b-4770-bc40-af3149d25304-auth-proxy-config\") pod \"machine-approver-56656f9798-vf689\" (UID: \"41c5c97b-d54b-4770-bc40-af3149d25304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746055 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c5c97b-d54b-4770-bc40-af3149d25304-config\") pod \"machine-approver-56656f9798-vf689\" (UID: \"41c5c97b-d54b-4770-bc40-af3149d25304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746072 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtvz4\" (UniqueName: \"kubernetes.io/projected/6b16e394-6692-4df4-ad2c-5163e126b448-kube-api-access-dtvz4\") pod \"cluster-samples-operator-665b6dd947-56shj\" (UID: \"6b16e394-6692-4df4-ad2c-5163e126b448\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746092 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2lc\" (UniqueName: \"kubernetes.io/projected/26f36385-887c-4459-9c97-b1fb8f8d1d26-kube-api-access-vr2lc\") pod \"openshift-apiserver-operator-796bbdcf4f-psmvp\" (UID: \"26f36385-887c-4459-9c97-b1fb8f8d1d26\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746107 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/21c410dc-98d6-4319-9bae-e4025e9fdbb5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-znzwx\" (UID: \"21c410dc-98d6-4319-9bae-e4025e9fdbb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746125 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746148 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746180 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746150 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c134aff-5bc5-4901-8746-5f79fb395b01-console-serving-cert\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746223 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746235 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c134aff-5bc5-4901-8746-5f79fb395b01-console-oauth-config\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746257 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhh9b\" (UniqueName: \"kubernetes.io/projected/0c134aff-5bc5-4901-8746-5f79fb395b01-kube-api-access-bhh9b\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746278 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746295 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f36385-887c-4459-9c97-b1fb8f8d1d26-config\") pod \"openshift-apiserver-operator-796bbdcf4f-psmvp\" (UID: \"26f36385-887c-4459-9c97-b1fb8f8d1d26\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746320 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-oauth-serving-cert\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc 
kubenswrapper[4778]: I1205 15:57:37.746337 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746355 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/41c5c97b-d54b-4770-bc40-af3149d25304-machine-approver-tls\") pod \"machine-approver-56656f9798-vf689\" (UID: \"41c5c97b-d54b-4770-bc40-af3149d25304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746410 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50ea62f2-f338-4164-8ecc-3d7d777c0d43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746426 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746454 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746492 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746452 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-console-config\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746560 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65kc6\" (UniqueName: \"kubernetes.io/projected/f859bdf8-f651-407a-a6b8-6c3ae2fe7f63-kube-api-access-65kc6\") pod \"machine-api-operator-5694c8668f-27smd\" (UID: \"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746593 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-client-ca\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746630 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f859bdf8-f651-407a-a6b8-6c3ae2fe7f63-images\") pod \"machine-api-operator-5694c8668f-27smd\" (UID: \"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746661 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv2rp\" (UniqueName: \"kubernetes.io/projected/10a607f0-e1ef-405d-9771-54076793d426-kube-api-access-hv2rp\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746689 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50ea62f2-f338-4164-8ecc-3d7d777c0d43-service-ca-bundle\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746713 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746736 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-etcd-client\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746750 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746764 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-audit-dir\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746779 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxhk9\" (UniqueName: \"kubernetes.io/projected/ea8ad42e-dfd1-486c-85c1-d4ff1bb95707-kube-api-access-lxhk9\") pod \"downloads-7954f5f757-smxt5\" (UID: \"ea8ad42e-dfd1-486c-85c1-d4ff1bb95707\") " pod="openshift-console/downloads-7954f5f757-smxt5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746808 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/cfbfb374-e786-4e49-8c80-54ec12c7abcc-audit-dir\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746838 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b16e394-6692-4df4-ad2c-5163e126b448-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-56shj\" (UID: \"6b16e394-6692-4df4-ad2c-5163e126b448\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.746968 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d6b1ed-75bd-4c5a-ab2d-294b7359301d-serving-cert\") pod \"openshift-config-operator-7777fb866f-5bl2q\" (UID: \"b8d6b1ed-75bd-4c5a-ab2d-294b7359301d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.747003 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c410dc-98d6-4319-9bae-e4025e9fdbb5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-znzwx\" (UID: \"21c410dc-98d6-4319-9bae-e4025e9fdbb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.747034 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.747064 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.747079 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cfbfb374-e786-4e49-8c80-54ec12c7abcc-encryption-config\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.747097 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-audit-policies\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.747139 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/11591472-458c-43dc-b51b-2b15987291a0-metrics-tls\") pod \"dns-operator-744455d44c-mw82h\" (UID: \"11591472-458c-43dc-b51b-2b15987291a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-mw82h" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.747164 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckx4f\" (UniqueName: \"kubernetes.io/projected/41c5c97b-d54b-4770-bc40-af3149d25304-kube-api-access-ckx4f\") pod \"machine-approver-56656f9798-vf689\" (UID: \"41c5c97b-d54b-4770-bc40-af3149d25304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.747199 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.747214 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-audit\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.747228 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cfbfb374-e786-4e49-8c80-54ec12c7abcc-etcd-client\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.747266 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw8l6\" (UniqueName: \"kubernetes.io/projected/50ea62f2-f338-4164-8ecc-3d7d777c0d43-kube-api-access-gw8l6\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.744092 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.748797 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.749269 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.749317 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.749446 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.749510 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.749744 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.749944 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.749953 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.750049 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.750331 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.750560 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.750652 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.750814 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.750830 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.750891 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w929v"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.750952 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.750976 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.751206 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.751668 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.751828 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.752052 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.752422 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m6452"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.752553 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.752632 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.752954 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-f26tf"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.753246 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.753306 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hshsw"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.753343 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.757237 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nnbzm"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.757995 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.758253 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.758442 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.758472 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.758708 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.758913 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.758932 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.759051 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.759180 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.759228 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.764094 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.764124 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mw82h"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.764135 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.764145 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.764222 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.767682 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kxmn5"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.767876 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.781542 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.784285 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.784327 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.784345 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-smxt5"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.784358 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.788601 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.788966 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.793818 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.795347 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.796979 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.802505 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4sh2x"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.808553 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.809276 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.809454 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.811319 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jfv5q"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.812203 4778 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.812392 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.813893 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ptlw8"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.815289 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffqnf"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.817541 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.818010 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.819657 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qdnqc"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.820413 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qdnqc" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.821024 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nf9v5"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.822061 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.822281 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ndj72"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.823014 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.823354 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.824581 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-gnnls"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.825610 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.826811 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ln57b"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.827923 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.829542 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m6452"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.830706 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.831789 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w929v"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.833052 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.834433 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-27smd"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.835620 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.836704 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nf9v5"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.837895 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-sb2zn"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.838806 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sb2zn" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.839048 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nnbzm"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.840686 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hshsw"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.841921 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.843199 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.843517 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.845257 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.845886 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qdnqc"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.850457 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-console-config\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.850596 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-client-ca\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.850718 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/05baa2fe-0ea7-41b3-9e70-04412e0e5658-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nfqps\" (UID: \"05baa2fe-0ea7-41b3-9e70-04412e0e5658\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.851011 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50ea62f2-f338-4164-8ecc-3d7d777c0d43-service-ca-bundle\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.851120 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05baa2fe-0ea7-41b3-9e70-04412e0e5658-config\") pod \"kube-controller-manager-operator-78b949d7b-nfqps\" (UID: \"05baa2fe-0ea7-41b3-9e70-04412e0e5658\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.851243 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhkfr\" (UniqueName: \"kubernetes.io/projected/a0b2c74a-c566-4614-aa26-65f67ae6fc94-kube-api-access-nhkfr\") pod \"machine-config-controller-84d6567774-w929v\" (UID: \"a0b2c74a-c566-4614-aa26-65f67ae6fc94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.851346 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv2rp\" (UniqueName: \"kubernetes.io/projected/10a607f0-e1ef-405d-9771-54076793d426-kube-api-access-hv2rp\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.851478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-audit-dir\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.851580 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r6r8\" (UID: \"d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.851686 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cfbfb374-e786-4e49-8c80-54ec12c7abcc-audit-dir\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.851781 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b8d6b1ed-75bd-4c5a-ab2d-294b7359301d-serving-cert\") pod \"openshift-config-operator-7777fb866f-5bl2q\" (UID: \"b8d6b1ed-75bd-4c5a-ab2d-294b7359301d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.851880 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c410dc-98d6-4319-9bae-e4025e9fdbb5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-znzwx\" (UID: \"21c410dc-98d6-4319-9bae-e4025e9fdbb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.851981 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/168fbd70-f065-4a1d-965a-c1d67493a528-metrics-certs\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.852083 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckx4f\" (UniqueName: \"kubernetes.io/projected/41c5c97b-d54b-4770-bc40-af3149d25304-kube-api-access-ckx4f\") pod \"machine-approver-56656f9798-vf689\" (UID: \"41c5c97b-d54b-4770-bc40-af3149d25304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.852182 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cfbfb374-e786-4e49-8c80-54ec12c7abcc-encryption-config\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.852266 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cfbfb374-e786-4e49-8c80-54ec12c7abcc-etcd-client\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.852339 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt8t6\" (UniqueName: \"kubernetes.io/projected/e9ff9de8-25f7-408a-834b-ec4edf3c98e5-kube-api-access-dt8t6\") pod \"ingress-operator-5b745b69d9-t72kw\" (UID: \"e9ff9de8-25f7-408a-834b-ec4edf3c98e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.852457 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a91f7a18-161e-4089-bc09-835d5b33b65f-serving-cert\") pod \"console-operator-58897d9998-jfv5q\" (UID: \"a91f7a18-161e-4089-bc09-835d5b33b65f\") " pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.852564 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78nbs\" (UniqueName: \"kubernetes.io/projected/9093abb1-e6d3-48d8-972a-88c8f8ec9fe2-kube-api-access-78nbs\") pod \"package-server-manager-789f6589d5-4mr8q\" (UID: 
\"9093abb1-e6d3-48d8-972a-88c8f8ec9fe2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.852674 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.852792 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r6r8\" (UID: \"d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.852900 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05baa2fe-0ea7-41b3-9e70-04412e0e5658-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nfqps\" (UID: \"05baa2fe-0ea7-41b3-9e70-04412e0e5658\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.853007 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2b86f96-71d0-4398-8963-5ad320ad5f2f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vz6vk\" (UID: \"b2b86f96-71d0-4398-8963-5ad320ad5f2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.853117 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4mc5\" (UniqueName: \"kubernetes.io/projected/e1235317-93be-4e63-b980-259191d32b82-kube-api-access-d4mc5\") pod \"migrator-59844c95c7-ndj72\" (UID: \"e1235317-93be-4e63-b980-259191d32b82\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ndj72" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.853210 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-audit-dir\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.851283 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.853259 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-client-ca\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.853503 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-gbrlh\" (UniqueName: \"kubernetes.io/projected/21e078d9-a539-4626-b30f-908b8e866a7a-kube-api-access-gbrlh\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.853679 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.853803 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cfbfb374-e786-4e49-8c80-54ec12c7abcc-audit-dir\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.853903 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ff9de8-25f7-408a-834b-ec4edf3c98e5-trusted-ca\") pod \"ingress-operator-5b745b69d9-t72kw\" (UID: \"e9ff9de8-25f7-408a-834b-ec4edf3c98e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.854004 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f36385-887c-4459-9c97-b1fb8f8d1d26-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-psmvp\" (UID: \"26f36385-887c-4459-9c97-b1fb8f8d1d26\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.854097 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f84a125-f112-4f6a-ae37-63cf387032c7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fv48p\" (UID: \"2f84a125-f112-4f6a-ae37-63cf387032c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.854204 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.854305 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.854649 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwxv5\" (UniqueName: 
\"kubernetes.io/projected/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-kube-api-access-qwxv5\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.854754 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-image-import-ca\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.854856 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ea62f2-f338-4164-8ecc-3d7d777c0d43-config\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.854947 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cca81879-8d18-4463-8db2-b17f8c24874c-images\") pod \"machine-config-operator-74547568cd-rbbgx\" (UID: \"cca81879-8d18-4463-8db2-b17f8c24874c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.855037 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-audit-policies\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.855163 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.855171 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg6jc\" (UniqueName: \"kubernetes.io/projected/a91f7a18-161e-4089-bc09-835d5b33b65f-kube-api-access-jg6jc\") pod \"console-operator-58897d9998-jfv5q\" (UID: \"a91f7a18-161e-4089-bc09-835d5b33b65f\") " pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.855390 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/782cf494-3079-47fe-8f6c-f7d5731a5b69-serving-cert\") pod \"route-controller-manager-6576b87f9c-6jmzh\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.855522 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/168fbd70-f065-4a1d-965a-c1d67493a528-stats-auth\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " 
pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.855627 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d862f1c7-a1fa-483d-8fd7-9b7e7ef51568-profile-collector-cert\") pod \"catalog-operator-68c6474976-lpqtr\" (UID: \"d862f1c7-a1fa-483d-8fd7-9b7e7ef51568\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.855729 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10a607f0-e1ef-405d-9771-54076793d426-serving-cert\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.855817 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvjct\" (UniqueName: \"kubernetes.io/projected/66a3882a-e9bc-40d4-b51f-e47d9354f53a-kube-api-access-nvjct\") pod \"marketplace-operator-79b997595-ptlw8\" (UID: \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.855912 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-encryption-config\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.856013 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a0bc7a3-053e-495b-ac9f-21322dace59d-srv-cert\") pod \"olm-operator-6b444d44fb-8xvnf\" (UID: \"6a0bc7a3-053e-495b-ac9f-21322dace59d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.856456 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-config\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.856564 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.856657 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7zpq\" (UniqueName: \"kubernetes.io/projected/5bcd665d-3ce2-437e-abcd-43175c1395c8-kube-api-access-b7zpq\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.856766 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-serving-cert\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.856853 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ff9de8-25f7-408a-834b-ec4edf3c98e5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t72kw\" (UID: \"e9ff9de8-25f7-408a-834b-ec4edf3c98e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.856926 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm8s4\" (UniqueName: \"kubernetes.io/projected/168fbd70-f065-4a1d-965a-c1d67493a528-kube-api-access-wm8s4\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.857015 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2c52\" (UniqueName: \"kubernetes.io/projected/cfbfb374-e786-4e49-8c80-54ec12c7abcc-kube-api-access-c2c52\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.857100 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfl8h\" (UniqueName: \"kubernetes.io/projected/b8d6b1ed-75bd-4c5a-ab2d-294b7359301d-kube-api-access-mfl8h\") pod \"openshift-config-operator-7777fb866f-5bl2q\" (UID: \"b8d6b1ed-75bd-4c5a-ab2d-294b7359301d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.857192 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.857285 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5snd\" (UniqueName: \"kubernetes.io/projected/9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61-kube-api-access-m5snd\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffn5l\" (UID: \"9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.857419 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f84a125-f112-4f6a-ae37-63cf387032c7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fv48p\" (UID: \"2f84a125-f112-4f6a-ae37-63cf387032c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858163 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-trusted-ca-bundle\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858244 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffn5l\" (UID: \"9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858320 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c5c97b-d54b-4770-bc40-af3149d25304-config\") pod \"machine-approver-56656f9798-vf689\" (UID: \"41c5c97b-d54b-4770-bc40-af3149d25304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858420 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtvz4\" (UniqueName: \"kubernetes.io/projected/6b16e394-6692-4df4-ad2c-5163e126b448-kube-api-access-dtvz4\") pod \"cluster-samples-operator-665b6dd947-56shj\" (UID: \"6b16e394-6692-4df4-ad2c-5163e126b448\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858497 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2lc\" (UniqueName: \"kubernetes.io/projected/26f36385-887c-4459-9c97-b1fb8f8d1d26-kube-api-access-vr2lc\") pod \"openshift-apiserver-operator-796bbdcf4f-psmvp\" (UID: \"26f36385-887c-4459-9c97-b1fb8f8d1d26\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858564 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cca81879-8d18-4463-8db2-b17f8c24874c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rbbgx\" (UID: \"cca81879-8d18-4463-8db2-b17f8c24874c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858643 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9ff9de8-25f7-408a-834b-ec4edf3c98e5-metrics-tls\") pod \"ingress-operator-5b745b69d9-t72kw\" (UID: \"e9ff9de8-25f7-408a-834b-ec4edf3c98e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858717 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41c5c97b-d54b-4770-bc40-af3149d25304-auth-proxy-config\") pod \"machine-approver-56656f9798-vf689\" (UID: \"41c5c97b-d54b-4770-bc40-af3149d25304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858791 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-ca\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858862 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858928 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a91f7a18-161e-4089-bc09-835d5b33b65f-trusted-ca\") pod \"console-operator-58897d9998-jfv5q\" (UID: \"a91f7a18-161e-4089-bc09-835d5b33b65f\") " pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858998 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d862f1c7-a1fa-483d-8fd7-9b7e7ef51568-srv-cert\") pod \"catalog-operator-68c6474976-lpqtr\" (UID: \"d862f1c7-a1fa-483d-8fd7-9b7e7ef51568\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.859084 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-client\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.859267 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0b2c74a-c566-4614-aa26-65f67ae6fc94-proxy-tls\") pod \"machine-config-controller-84d6567774-w929v\" (UID: \"a0b2c74a-c566-4614-aa26-65f67ae6fc94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.859352 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f36385-887c-4459-9c97-b1fb8f8d1d26-config\") pod \"openshift-apiserver-operator-796bbdcf4f-psmvp\" (UID: \"26f36385-887c-4459-9c97-b1fb8f8d1d26\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.859450 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858569 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: 
\"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.859565 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2b86f96-71d0-4398-8963-5ad320ad5f2f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vz6vk\" (UID: \"b2b86f96-71d0-4398-8963-5ad320ad5f2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.862710 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ea62f2-f338-4164-8ecc-3d7d777c0d43-config\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.862751 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-image-import-ca\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.862754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.862816 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-oauth-serving-cert\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.862953 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65kc6\" (UniqueName: \"kubernetes.io/projected/f859bdf8-f651-407a-a6b8-6c3ae2fe7f63-kube-api-access-65kc6\") pod \"machine-api-operator-5694c8668f-27smd\" (UID: \"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.862974 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66a3882a-e9bc-40d4-b51f-e47d9354f53a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ptlw8\" (UID: \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863018 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r6r8\" (UID: \"d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863048 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p8rw\" (UniqueName: \"kubernetes.io/projected/3d21106f-1d62-4e2f-98ac-5411f66d8352-kube-api-access-2p8rw\") pod \"multus-admission-controller-857f4d67dd-nnbzm\" (UID: \"3d21106f-1d62-4e2f-98ac-5411f66d8352\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863068 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cca81879-8d18-4463-8db2-b17f8c24874c-proxy-tls\") pod \"machine-config-operator-74547568cd-rbbgx\" (UID: \"cca81879-8d18-4463-8db2-b17f8c24874c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863071 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863091 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zg62\" (UniqueName: \"kubernetes.io/projected/cca81879-8d18-4463-8db2-b17f8c24874c-kube-api-access-4zg62\") pod \"machine-config-operator-74547568cd-rbbgx\" (UID: \"cca81879-8d18-4463-8db2-b17f8c24874c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863123 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f859bdf8-f651-407a-a6b8-6c3ae2fe7f63-images\") pod \"machine-api-operator-5694c8668f-27smd\" (UID: \"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863147 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-etcd-client\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863198 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863231 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxhk9\" (UniqueName: \"kubernetes.io/projected/ea8ad42e-dfd1-486c-85c1-d4ff1bb95707-kube-api-access-lxhk9\") pod \"downloads-7954f5f757-smxt5\" (UID: \"ea8ad42e-dfd1-486c-85c1-d4ff1bb95707\") " pod="openshift-console/downloads-7954f5f757-smxt5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863273 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863312 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g5b9\" (UniqueName: \"kubernetes.io/projected/d862f1c7-a1fa-483d-8fd7-9b7e7ef51568-kube-api-access-4g5b9\") pod \"catalog-operator-68c6474976-lpqtr\" (UID: \"d862f1c7-a1fa-483d-8fd7-9b7e7ef51568\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863336 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b16e394-6692-4df4-ad2c-5163e126b448-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-56shj\" (UID: \"6b16e394-6692-4df4-ad2c-5163e126b448\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863376 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66a3882a-e9bc-40d4-b51f-e47d9354f53a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ptlw8\" (UID: \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863403 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/168fbd70-f065-4a1d-965a-c1d67493a528-service-ca-bundle\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863431 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863453 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-audit-policies\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863470 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/11591472-458c-43dc-b51b-2b15987291a0-metrics-tls\") pod \"dns-operator-744455d44c-mw82h\" (UID: \"11591472-458c-43dc-b51b-2b15987291a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-mw82h" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863496 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-service-ca\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863531 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863557 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a91f7a18-161e-4089-bc09-835d5b33b65f-config\") pod \"console-operator-58897d9998-jfv5q\" (UID: \"a91f7a18-161e-4089-bc09-835d5b33b65f\") " pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863582 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-audit\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863603 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-config\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863603 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-oauth-serving-cert\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863626 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw8l6\" (UniqueName: \"kubernetes.io/projected/50ea62f2-f338-4164-8ecc-3d7d777c0d43-kube-api-access-gw8l6\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863679 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cac1e607-7735-4696-9666-34cb5ecb4857-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8gjtp\" (UID: \"cac1e607-7735-4696-9666-34cb5ecb4857\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863732 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpwtv\" (UniqueName: 
\"kubernetes.io/projected/2f84a125-f112-4f6a-ae37-63cf387032c7-kube-api-access-qpwtv\") pod \"cluster-image-registry-operator-dc59b4c8b-fv48p\" (UID: \"2f84a125-f112-4f6a-ae37-63cf387032c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858584 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d6b1ed-75bd-4c5a-ab2d-294b7359301d-serving-cert\") pod \"openshift-config-operator-7777fb866f-5bl2q\" (UID: \"b8d6b1ed-75bd-4c5a-ab2d-294b7359301d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863787 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/782cf494-3079-47fe-8f6c-f7d5731a5b69-config\") pod \"route-controller-manager-6576b87f9c-6jmzh\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863809 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-f26tf"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863814 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rbqx\" (UniqueName: \"kubernetes.io/projected/782cf494-3079-47fe-8f6c-f7d5731a5b69-kube-api-access-5rbqx\") pod \"route-controller-manager-6576b87f9c-6jmzh\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863885 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfbfb374-e786-4e49-8c80-54ec12c7abcc-serving-cert\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863921 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9093abb1-e6d3-48d8-972a-88c8f8ec9fe2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4mr8q\" (UID: \"9093abb1-e6d3-48d8-972a-88c8f8ec9fe2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863952 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.863986 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghshr\" (UniqueName: \"kubernetes.io/projected/6a0bc7a3-053e-495b-ac9f-21322dace59d-kube-api-access-ghshr\") pod \"olm-operator-6b444d44fb-8xvnf\" (UID: \"6a0bc7a3-053e-495b-ac9f-21322dace59d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" Dec 05 15:57:37 crc 
kubenswrapper[4778]: I1205 15:57:37.864043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frhpq\" (UniqueName: \"kubernetes.io/projected/b2b86f96-71d0-4398-8963-5ad320ad5f2f-kube-api-access-frhpq\") pod \"kube-storage-version-migrator-operator-b67b599dd-vz6vk\" (UID: \"b2b86f96-71d0-4398-8963-5ad320ad5f2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864074 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-service-ca\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864108 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h274\" (UniqueName: \"kubernetes.io/projected/cac1e607-7735-4696-9666-34cb5ecb4857-kube-api-access-9h274\") pod \"control-plane-machine-set-operator-78cbb6b69f-8gjtp\" (UID: \"cac1e607-7735-4696-9666-34cb5ecb4857\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864143 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0b2c74a-c566-4614-aa26-65f67ae6fc94-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w929v\" (UID: \"a0b2c74a-c566-4614-aa26-65f67ae6fc94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864174 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/168fbd70-f065-4a1d-965a-c1d67493a528-default-certificate\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864203 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bcd665d-3ce2-437e-abcd-43175c1395c8-serving-cert\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/782cf494-3079-47fe-8f6c-f7d5731a5b69-client-ca\") pod \"route-controller-manager-6576b87f9c-6jmzh\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864291 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c410dc-98d6-4319-9bae-e4025e9fdbb5-config\") pod \"kube-apiserver-operator-766d6c64bb-znzwx\" (UID: \"21c410dc-98d6-4319-9bae-e4025e9fdbb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx" Dec 05 15:57:37 crc 
kubenswrapper[4778]: I1205 15:57:37.864320 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f859bdf8-f651-407a-a6b8-6c3ae2fe7f63-config\") pod \"machine-api-operator-5694c8668f-27smd\" (UID: \"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864355 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg8g8\" (UniqueName: \"kubernetes.io/projected/11591472-458c-43dc-b51b-2b15987291a0-kube-api-access-kg8g8\") pod \"dns-operator-744455d44c-mw82h\" (UID: \"11591472-458c-43dc-b51b-2b15987291a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-mw82h" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864408 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-config\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864456 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50ea62f2-f338-4164-8ecc-3d7d777c0d43-serving-cert\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864483 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21e078d9-a539-4626-b30f-908b8e866a7a-audit-dir\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864515 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cfbfb374-e786-4e49-8c80-54ec12c7abcc-node-pullsecrets\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864546 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffn5l\" (UID: \"9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864581 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f84a125-f112-4f6a-ae37-63cf387032c7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fv48p\" (UID: \"2f84a125-f112-4f6a-ae37-63cf387032c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864610 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864646 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f859bdf8-f651-407a-a6b8-6c3ae2fe7f63-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-27smd\" (UID: \"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864667 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41c5c97b-d54b-4770-bc40-af3149d25304-auth-proxy-config\") pod \"machine-approver-56656f9798-vf689\" (UID: \"41c5c97b-d54b-4770-bc40-af3149d25304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864681 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.858739 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f36385-887c-4459-9c97-b1fb8f8d1d26-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-psmvp\" (UID: \"26f36385-887c-4459-9c97-b1fb8f8d1d26\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864737 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b8d6b1ed-75bd-4c5a-ab2d-294b7359301d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5bl2q\" (UID: \"b8d6b1ed-75bd-4c5a-ab2d-294b7359301d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864773 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21c410dc-98d6-4319-9bae-e4025e9fdbb5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-znzwx\" (UID: \"21c410dc-98d6-4319-9bae-e4025e9fdbb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864820 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c134aff-5bc5-4901-8746-5f79fb395b01-console-serving-cert\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864852 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c134aff-5bc5-4901-8746-5f79fb395b01-console-oauth-config\") pod \"console-f9d7485db-gnnls\" 
(UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864883 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhh9b\" (UniqueName: \"kubernetes.io/projected/0c134aff-5bc5-4901-8746-5f79fb395b01-kube-api-access-bhh9b\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864917 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864944 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/41c5c97b-d54b-4770-bc40-af3149d25304-machine-approver-tls\") pod \"machine-approver-56656f9798-vf689\" (UID: \"41c5c97b-d54b-4770-bc40-af3149d25304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.864976 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50ea62f2-f338-4164-8ecc-3d7d777c0d43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.865009 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a0bc7a3-053e-495b-ac9f-21322dace59d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8xvnf\" (UID: \"6a0bc7a3-053e-495b-ac9f-21322dace59d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.865034 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3d21106f-1d62-4e2f-98ac-5411f66d8352-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nnbzm\" (UID: \"3d21106f-1d62-4e2f-98ac-5411f66d8352\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.865074 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.865170 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 
15:57:37.865790 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cfbfb374-e786-4e49-8c80-54ec12c7abcc-etcd-client\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.866328 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21e078d9-a539-4626-b30f-908b8e866a7a-audit-dir\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.866964 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cfbfb374-e786-4e49-8c80-54ec12c7abcc-encryption-config\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.867115 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bctw8"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.867315 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.867939 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-trusted-ca-bundle\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.868524 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.868612 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cfbfb374-e786-4e49-8c80-54ec12c7abcc-node-pullsecrets\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.869014 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.852328 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-console-config\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.869154 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.869928 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/782cf494-3079-47fe-8f6c-f7d5731a5b69-serving-cert\") pod \"route-controller-manager-6576b87f9c-6jmzh\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.870115 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b16e394-6692-4df4-ad2c-5163e126b448-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-56shj\" (UID: \"6b16e394-6692-4df4-ad2c-5163e126b448\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.870640 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f859bdf8-f651-407a-a6b8-6c3ae2fe7f63-images\") pod \"machine-api-operator-5694c8668f-27smd\" (UID: \"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.871235 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffn5l\" (UID: \"9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.871268 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffn5l\" (UID: \"9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.871337 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bctw8"] Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.871579 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c5c97b-d54b-4770-bc40-af3149d25304-config\") pod \"machine-approver-56656f9798-vf689\" (UID: \"41c5c97b-d54b-4770-bc40-af3149d25304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.871998 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f36385-887c-4459-9c97-b1fb8f8d1d26-config\") pod \"openshift-apiserver-operator-796bbdcf4f-psmvp\" (UID: \"26f36385-887c-4459-9c97-b1fb8f8d1d26\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp" Dec 
05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.873834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.874172 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-encryption-config\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.853772 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50ea62f2-f338-4164-8ecc-3d7d777c0d43-service-ca-bundle\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.874393 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.874614 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.875097 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-service-ca\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.875183 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-audit-policies\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.875569 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f859bdf8-f651-407a-a6b8-6c3ae2fe7f63-config\") pod \"machine-api-operator-5694c8668f-27smd\" (UID: \"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.875870 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-config\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.876082 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-etcd-client\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.876449 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/782cf494-3079-47fe-8f6c-f7d5731a5b69-client-ca\") pod \"route-controller-manager-6576b87f9c-6jmzh\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.876559 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-audit\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.877152 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.877202 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfbfb374-e786-4e49-8c80-54ec12c7abcc-serving-cert\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.877542 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50ea62f2-f338-4164-8ecc-3d7d777c0d43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.877674 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10a607f0-e1ef-405d-9771-54076793d426-serving-cert\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.877750 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.878551 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cfbfb374-e786-4e49-8c80-54ec12c7abcc-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:37 crc 
kubenswrapper[4778]: I1205 15:57:37.878821 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/11591472-458c-43dc-b51b-2b15987291a0-metrics-tls\") pod \"dns-operator-744455d44c-mw82h\" (UID: \"11591472-458c-43dc-b51b-2b15987291a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-mw82h" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.878857 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-audit-policies\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.878984 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/782cf494-3079-47fe-8f6c-f7d5731a5b69-config\") pod \"route-controller-manager-6576b87f9c-6jmzh\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.879035 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-serving-cert\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.879279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b8d6b1ed-75bd-4c5a-ab2d-294b7359301d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5bl2q\" (UID: \"b8d6b1ed-75bd-4c5a-ab2d-294b7359301d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.879920 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.879993 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-config\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.880041 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f859bdf8-f651-407a-a6b8-6c3ae2fe7f63-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-27smd\" (UID: \"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.880358 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50ea62f2-f338-4164-8ecc-3d7d777c0d43-serving-cert\") pod 
\"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.880868 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/41c5c97b-d54b-4770-bc40-af3149d25304-machine-approver-tls\") pod \"machine-approver-56656f9798-vf689\" (UID: \"41c5c97b-d54b-4770-bc40-af3149d25304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.881560 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.881684 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c134aff-5bc5-4901-8746-5f79fb395b01-console-oauth-config\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.882932 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.883350 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c134aff-5bc5-4901-8746-5f79fb395b01-console-serving-cert\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.902830 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.923238 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.943105 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.963060 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966163 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvjct\" (UniqueName: \"kubernetes.io/projected/66a3882a-e9bc-40d4-b51f-e47d9354f53a-kube-api-access-nvjct\") pod \"marketplace-operator-79b997595-ptlw8\" (UID: \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966200 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/6a0bc7a3-053e-495b-ac9f-21322dace59d-srv-cert\") pod \"olm-operator-6b444d44fb-8xvnf\" (UID: \"6a0bc7a3-053e-495b-ac9f-21322dace59d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966221 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7zpq\" (UniqueName: \"kubernetes.io/projected/5bcd665d-3ce2-437e-abcd-43175c1395c8-kube-api-access-b7zpq\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966238 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ff9de8-25f7-408a-834b-ec4edf3c98e5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t72kw\" (UID: \"e9ff9de8-25f7-408a-834b-ec4edf3c98e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966254 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm8s4\" (UniqueName: \"kubernetes.io/projected/168fbd70-f065-4a1d-965a-c1d67493a528-kube-api-access-wm8s4\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966290 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f84a125-f112-4f6a-ae37-63cf387032c7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fv48p\" (UID: \"2f84a125-f112-4f6a-ae37-63cf387032c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966320 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cca81879-8d18-4463-8db2-b17f8c24874c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rbbgx\" (UID: \"cca81879-8d18-4463-8db2-b17f8c24874c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966337 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9ff9de8-25f7-408a-834b-ec4edf3c98e5-metrics-tls\") pod \"ingress-operator-5b745b69d9-t72kw\" (UID: \"e9ff9de8-25f7-408a-834b-ec4edf3c98e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966353 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-ca\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966382 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a91f7a18-161e-4089-bc09-835d5b33b65f-trusted-ca\") pod \"console-operator-58897d9998-jfv5q\" (UID: \"a91f7a18-161e-4089-bc09-835d5b33b65f\") " 
pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966397 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d862f1c7-a1fa-483d-8fd7-9b7e7ef51568-srv-cert\") pod \"catalog-operator-68c6474976-lpqtr\" (UID: \"d862f1c7-a1fa-483d-8fd7-9b7e7ef51568\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966411 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-client\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966431 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0b2c74a-c566-4614-aa26-65f67ae6fc94-proxy-tls\") pod \"machine-config-controller-84d6567774-w929v\" (UID: \"a0b2c74a-c566-4614-aa26-65f67ae6fc94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2b86f96-71d0-4398-8963-5ad320ad5f2f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vz6vk\" (UID: \"b2b86f96-71d0-4398-8963-5ad320ad5f2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66a3882a-e9bc-40d4-b51f-e47d9354f53a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ptlw8\" (UID: \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966501 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r6r8\" (UID: \"d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966520 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p8rw\" (UniqueName: \"kubernetes.io/projected/3d21106f-1d62-4e2f-98ac-5411f66d8352-kube-api-access-2p8rw\") pod \"multus-admission-controller-857f4d67dd-nnbzm\" (UID: \"3d21106f-1d62-4e2f-98ac-5411f66d8352\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966534 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cca81879-8d18-4463-8db2-b17f8c24874c-proxy-tls\") pod \"machine-config-operator-74547568cd-rbbgx\" (UID: \"cca81879-8d18-4463-8db2-b17f8c24874c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:37 crc 
kubenswrapper[4778]: I1205 15:57:37.966549 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zg62\" (UniqueName: \"kubernetes.io/projected/cca81879-8d18-4463-8db2-b17f8c24874c-kube-api-access-4zg62\") pod \"machine-config-operator-74547568cd-rbbgx\" (UID: \"cca81879-8d18-4463-8db2-b17f8c24874c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966572 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g5b9\" (UniqueName: \"kubernetes.io/projected/d862f1c7-a1fa-483d-8fd7-9b7e7ef51568-kube-api-access-4g5b9\") pod \"catalog-operator-68c6474976-lpqtr\" (UID: \"d862f1c7-a1fa-483d-8fd7-9b7e7ef51568\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966587 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66a3882a-e9bc-40d4-b51f-e47d9354f53a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ptlw8\" (UID: \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966603 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/168fbd70-f065-4a1d-965a-c1d67493a528-service-ca-bundle\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966621 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-service-ca\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966644 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a91f7a18-161e-4089-bc09-835d5b33b65f-config\") pod \"console-operator-58897d9998-jfv5q\" (UID: \"a91f7a18-161e-4089-bc09-835d5b33b65f\") " pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966662 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-config\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966684 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cac1e607-7735-4696-9666-34cb5ecb4857-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8gjtp\" (UID: \"cac1e607-7735-4696-9666-34cb5ecb4857\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966701 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qpwtv\" (UniqueName: \"kubernetes.io/projected/2f84a125-f112-4f6a-ae37-63cf387032c7-kube-api-access-qpwtv\") pod \"cluster-image-registry-operator-dc59b4c8b-fv48p\" (UID: \"2f84a125-f112-4f6a-ae37-63cf387032c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966728 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9093abb1-e6d3-48d8-972a-88c8f8ec9fe2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4mr8q\" (UID: \"9093abb1-e6d3-48d8-972a-88c8f8ec9fe2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966746 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghshr\" (UniqueName: \"kubernetes.io/projected/6a0bc7a3-053e-495b-ac9f-21322dace59d-kube-api-access-ghshr\") pod \"olm-operator-6b444d44fb-8xvnf\" (UID: \"6a0bc7a3-053e-495b-ac9f-21322dace59d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966769 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frhpq\" (UniqueName: \"kubernetes.io/projected/b2b86f96-71d0-4398-8963-5ad320ad5f2f-kube-api-access-frhpq\") pod \"kube-storage-version-migrator-operator-b67b599dd-vz6vk\" (UID: \"b2b86f96-71d0-4398-8963-5ad320ad5f2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966788 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h274\" (UniqueName: \"kubernetes.io/projected/cac1e607-7735-4696-9666-34cb5ecb4857-kube-api-access-9h274\") pod \"control-plane-machine-set-operator-78cbb6b69f-8gjtp\" (UID: \"cac1e607-7735-4696-9666-34cb5ecb4857\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966804 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0b2c74a-c566-4614-aa26-65f67ae6fc94-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w929v\" (UID: \"a0b2c74a-c566-4614-aa26-65f67ae6fc94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966819 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/168fbd70-f065-4a1d-965a-c1d67493a528-default-certificate\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966834 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bcd665d-3ce2-437e-abcd-43175c1395c8-serving-cert\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966870 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f84a125-f112-4f6a-ae37-63cf387032c7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fv48p\" (UID: \"2f84a125-f112-4f6a-ae37-63cf387032c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966906 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a0bc7a3-053e-495b-ac9f-21322dace59d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8xvnf\" (UID: \"6a0bc7a3-053e-495b-ac9f-21322dace59d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966921 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3d21106f-1d62-4e2f-98ac-5411f66d8352-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nnbzm\" (UID: \"3d21106f-1d62-4e2f-98ac-5411f66d8352\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966940 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05baa2fe-0ea7-41b3-9e70-04412e0e5658-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nfqps\" (UID: \"05baa2fe-0ea7-41b3-9e70-04412e0e5658\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966958 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05baa2fe-0ea7-41b3-9e70-04412e0e5658-config\") pod \"kube-controller-manager-operator-78b949d7b-nfqps\" (UID: \"05baa2fe-0ea7-41b3-9e70-04412e0e5658\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966973 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhkfr\" (UniqueName: \"kubernetes.io/projected/a0b2c74a-c566-4614-aa26-65f67ae6fc94-kube-api-access-nhkfr\") pod \"machine-config-controller-84d6567774-w929v\" (UID: \"a0b2c74a-c566-4614-aa26-65f67ae6fc94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.966995 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r6r8\" (UID: \"d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967018 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/168fbd70-f065-4a1d-965a-c1d67493a528-metrics-certs\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967041 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dt8t6\" (UniqueName: \"kubernetes.io/projected/e9ff9de8-25f7-408a-834b-ec4edf3c98e5-kube-api-access-dt8t6\") pod \"ingress-operator-5b745b69d9-t72kw\" (UID: \"e9ff9de8-25f7-408a-834b-ec4edf3c98e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967058 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a91f7a18-161e-4089-bc09-835d5b33b65f-serving-cert\") pod \"console-operator-58897d9998-jfv5q\" (UID: \"a91f7a18-161e-4089-bc09-835d5b33b65f\") " pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967073 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78nbs\" (UniqueName: \"kubernetes.io/projected/9093abb1-e6d3-48d8-972a-88c8f8ec9fe2-kube-api-access-78nbs\") pod \"package-server-manager-789f6589d5-4mr8q\" (UID: \"9093abb1-e6d3-48d8-972a-88c8f8ec9fe2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967067 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cca81879-8d18-4463-8db2-b17f8c24874c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rbbgx\" (UID: \"cca81879-8d18-4463-8db2-b17f8c24874c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967089 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r6r8\" (UID: \"d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967106 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05baa2fe-0ea7-41b3-9e70-04412e0e5658-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nfqps\" (UID: \"05baa2fe-0ea7-41b3-9e70-04412e0e5658\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967123 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2b86f96-71d0-4398-8963-5ad320ad5f2f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vz6vk\" (UID: \"b2b86f96-71d0-4398-8963-5ad320ad5f2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967138 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4mc5\" (UniqueName: \"kubernetes.io/projected/e1235317-93be-4e63-b980-259191d32b82-kube-api-access-d4mc5\") pod \"migrator-59844c95c7-ndj72\" (UID: \"e1235317-93be-4e63-b980-259191d32b82\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ndj72" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967176 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e9ff9de8-25f7-408a-834b-ec4edf3c98e5-trusted-ca\") pod \"ingress-operator-5b745b69d9-t72kw\" (UID: \"e9ff9de8-25f7-408a-834b-ec4edf3c98e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967192 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f84a125-f112-4f6a-ae37-63cf387032c7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fv48p\" (UID: \"2f84a125-f112-4f6a-ae37-63cf387032c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967245 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cca81879-8d18-4463-8db2-b17f8c24874c-images\") pod \"machine-config-operator-74547568cd-rbbgx\" (UID: \"cca81879-8d18-4463-8db2-b17f8c24874c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967262 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg6jc\" (UniqueName: \"kubernetes.io/projected/a91f7a18-161e-4089-bc09-835d5b33b65f-kube-api-access-jg6jc\") pod \"console-operator-58897d9998-jfv5q\" (UID: \"a91f7a18-161e-4089-bc09-835d5b33b65f\") " pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967280 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/168fbd70-f065-4a1d-965a-c1d67493a528-stats-auth\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967299 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d862f1c7-a1fa-483d-8fd7-9b7e7ef51568-profile-collector-cert\") pod \"catalog-operator-68c6474976-lpqtr\" (UID: \"d862f1c7-a1fa-483d-8fd7-9b7e7ef51568\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.967612 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2b86f96-71d0-4398-8963-5ad320ad5f2f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vz6vk\" (UID: \"b2b86f96-71d0-4398-8963-5ad320ad5f2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.968717 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r6r8\" (UID: \"d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.969264 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ff9de8-25f7-408a-834b-ec4edf3c98e5-trusted-ca\") pod \"ingress-operator-5b745b69d9-t72kw\" (UID: 
\"e9ff9de8-25f7-408a-834b-ec4edf3c98e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.969779 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0b2c74a-c566-4614-aa26-65f67ae6fc94-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w929v\" (UID: \"a0b2c74a-c566-4614-aa26-65f67ae6fc94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.969849 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f84a125-f112-4f6a-ae37-63cf387032c7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fv48p\" (UID: \"2f84a125-f112-4f6a-ae37-63cf387032c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.969880 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9ff9de8-25f7-408a-834b-ec4edf3c98e5-metrics-tls\") pod \"ingress-operator-5b745b69d9-t72kw\" (UID: \"e9ff9de8-25f7-408a-834b-ec4edf3c98e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.970113 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05baa2fe-0ea7-41b3-9e70-04412e0e5658-config\") pod \"kube-controller-manager-operator-78b949d7b-nfqps\" (UID: \"05baa2fe-0ea7-41b3-9e70-04412e0e5658\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.970229 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f84a125-f112-4f6a-ae37-63cf387032c7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fv48p\" (UID: \"2f84a125-f112-4f6a-ae37-63cf387032c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.971823 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05baa2fe-0ea7-41b3-9e70-04412e0e5658-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nfqps\" (UID: \"05baa2fe-0ea7-41b3-9e70-04412e0e5658\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.983685 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r6r8\" (UID: \"d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.984039 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 15:57:37 crc kubenswrapper[4778]: I1205 15:57:37.991543 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b2b86f96-71d0-4398-8963-5ad320ad5f2f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vz6vk\" (UID: \"b2b86f96-71d0-4398-8963-5ad320ad5f2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.002913 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.024056 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.043179 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.062837 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.071907 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/168fbd70-f065-4a1d-965a-c1d67493a528-stats-auth\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.084125 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.096136 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/168fbd70-f065-4a1d-965a-c1d67493a528-default-certificate\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.104874 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.123351 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.131298 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/168fbd70-f065-4a1d-965a-c1d67493a528-metrics-certs\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.143308 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.148725 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/168fbd70-f065-4a1d-965a-c1d67493a528-service-ca-bundle\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.163949 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 15:57:38 crc 
kubenswrapper[4778]: I1205 15:57:38.183661 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.187703 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c410dc-98d6-4319-9bae-e4025e9fdbb5-config\") pod \"kube-apiserver-operator-766d6c64bb-znzwx\" (UID: \"21c410dc-98d6-4319-9bae-e4025e9fdbb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.204458 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.219122 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c410dc-98d6-4319-9bae-e4025e9fdbb5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-znzwx\" (UID: \"21c410dc-98d6-4319-9bae-e4025e9fdbb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.224057 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.244723 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.263712 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.269308 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.284965 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.289474 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a91f7a18-161e-4089-bc09-835d5b33b65f-config\") pod \"console-operator-58897d9998-jfv5q\" (UID: \"a91f7a18-161e-4089-bc09-835d5b33b65f\") " pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.304541 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.312814 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66a3882a-e9bc-40d4-b51f-e47d9354f53a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ptlw8\" (UID: \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.324301 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.355203 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.359915 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66a3882a-e9bc-40d4-b51f-e47d9354f53a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ptlw8\" (UID: \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.363821 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.384073 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.413702 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.418637 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a91f7a18-161e-4089-bc09-835d5b33b65f-trusted-ca\") pod \"console-operator-58897d9998-jfv5q\" (UID: \"a91f7a18-161e-4089-bc09-835d5b33b65f\") " pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.423538 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.443825 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.463175 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.473015 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a91f7a18-161e-4089-bc09-835d5b33b65f-serving-cert\") pod \"console-operator-58897d9998-jfv5q\" (UID: \"a91f7a18-161e-4089-bc09-835d5b33b65f\") " pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.484424 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.493857 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cac1e607-7735-4696-9666-34cb5ecb4857-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8gjtp\" (UID: \"cac1e607-7735-4696-9666-34cb5ecb4857\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.504402 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.523872 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.543655 4778 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.564548 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.584223 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.590635 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cca81879-8d18-4463-8db2-b17f8c24874c-images\") pod \"machine-config-operator-74547568cd-rbbgx\" (UID: \"cca81879-8d18-4463-8db2-b17f8c24874c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.603597 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.613842 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cca81879-8d18-4463-8db2-b17f8c24874c-proxy-tls\") pod \"machine-config-operator-74547568cd-rbbgx\" (UID: \"cca81879-8d18-4463-8db2-b17f8c24874c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.623935 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.643719 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.663348 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.672062 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d862f1c7-a1fa-483d-8fd7-9b7e7ef51568-profile-collector-cert\") pod \"catalog-operator-68c6474976-lpqtr\" (UID: \"d862f1c7-a1fa-483d-8fd7-9b7e7ef51568\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.673109 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a0bc7a3-053e-495b-ac9f-21322dace59d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8xvnf\" (UID: \"6a0bc7a3-053e-495b-ac9f-21322dace59d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.683789 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.703833 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.712607 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/6a0bc7a3-053e-495b-ac9f-21322dace59d-srv-cert\") pod \"olm-operator-6b444d44fb-8xvnf\" (UID: \"6a0bc7a3-053e-495b-ac9f-21322dace59d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.723725 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.761784 4778 request.go:700] Waited for 1.010411369s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackage-server-manager-serving-cert&limit=500&resourceVersion=0 Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.763642 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.773549 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9093abb1-e6d3-48d8-972a-88c8f8ec9fe2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4mr8q\" (UID: \"9093abb1-e6d3-48d8-972a-88c8f8ec9fe2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.784043 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.791628 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0b2c74a-c566-4614-aa26-65f67ae6fc94-proxy-tls\") pod \"machine-config-controller-84d6567774-w929v\" (UID: \"a0b2c74a-c566-4614-aa26-65f67ae6fc94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.804148 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.823882 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.844245 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.863435 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.884704 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.904671 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.924135 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.944289 4778 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.963050 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.967355 4778 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-client: failed to sync secret cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.967658 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-client podName:5bcd665d-3ce2-437e-abcd-43175c1395c8 nodeName:}" failed. No retries permitted until 2025-12-05 15:57:39.467627524 +0000 UTC m=+146.571423934 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-client") pod "etcd-operator-b45778765-f26tf" (UID: "5bcd665d-3ce2-437e-abcd-43175c1395c8") : failed to sync secret cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.967687 4778 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.968091 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-config podName:5bcd665d-3ce2-437e-abcd-43175c1395c8 nodeName:}" failed. No retries permitted until 2025-12-05 15:57:39.468066376 +0000 UTC m=+146.571862796 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-config") pod "etcd-operator-b45778765-f26tf" (UID: "5bcd665d-3ce2-437e-abcd-43175c1395c8") : failed to sync configmap cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.967447 4778 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.967436 4778 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.968677 4778 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.967719 4778 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.968587 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d862f1c7-a1fa-483d-8fd7-9b7e7ef51568-srv-cert podName:d862f1c7-a1fa-483d-8fd7-9b7e7ef51568 nodeName:}" failed. No retries permitted until 2025-12-05 15:57:39.468564979 +0000 UTC m=+146.572361399 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d862f1c7-a1fa-483d-8fd7-9b7e7ef51568-srv-cert") pod "catalog-operator-68c6474976-lpqtr" (UID: "d862f1c7-a1fa-483d-8fd7-9b7e7ef51568") : failed to sync secret cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.968908 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-ca podName:5bcd665d-3ce2-437e-abcd-43175c1395c8 nodeName:}" failed. No retries permitted until 2025-12-05 15:57:39.468877517 +0000 UTC m=+146.572673937 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-ca" (UniqueName: "kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-ca") pod "etcd-operator-b45778765-f26tf" (UID: "5bcd665d-3ce2-437e-abcd-43175c1395c8") : failed to sync configmap cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.968998 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5bcd665d-3ce2-437e-abcd-43175c1395c8-serving-cert podName:5bcd665d-3ce2-437e-abcd-43175c1395c8 nodeName:}" failed. No retries permitted until 2025-12-05 15:57:39.4689754 +0000 UTC m=+146.572771820 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5bcd665d-3ce2-437e-abcd-43175c1395c8-serving-cert") pod "etcd-operator-b45778765-f26tf" (UID: "5bcd665d-3ce2-437e-abcd-43175c1395c8") : failed to sync secret cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.969087 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-service-ca podName:5bcd665d-3ce2-437e-abcd-43175c1395c8 nodeName:}" failed. No retries permitted until 2025-12-05 15:57:39.469016691 +0000 UTC m=+146.572813111 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-service-ca" (UniqueName: "kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-service-ca") pod "etcd-operator-b45778765-f26tf" (UID: "5bcd665d-3ce2-437e-abcd-43175c1395c8") : failed to sync configmap cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.969508 4778 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: E1205 15:57:38.969743 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d21106f-1d62-4e2f-98ac-5411f66d8352-webhook-certs podName:3d21106f-1d62-4e2f-98ac-5411f66d8352 nodeName:}" failed. No retries permitted until 2025-12-05 15:57:39.469718991 +0000 UTC m=+146.573515411 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3d21106f-1d62-4e2f-98ac-5411f66d8352-webhook-certs") pod "multus-admission-controller-857f4d67dd-nnbzm" (UID: "3d21106f-1d62-4e2f-98ac-5411f66d8352") : failed to sync secret cache: timed out waiting for the condition Dec 05 15:57:38 crc kubenswrapper[4778]: I1205 15:57:38.983026 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.003640 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.023414 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.043780 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.063486 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.084711 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.088036 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.089293 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.090227 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.092459 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.105838 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.124102 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 15:57:39 crc 
kubenswrapper[4778]: I1205 15:57:39.143733 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.164711 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.176779 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.184692 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.190893 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:39 crc kubenswrapper[4778]: E1205 15:57:39.191044 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:59:41.191023421 +0000 UTC m=+268.294819851 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.191425 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.191477 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.196228 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.200933 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.201265 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.204996 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.224042 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.244275 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.263914 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.268723 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.284410 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.304740 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.324620 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.345074 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.364284 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.386434 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.405264 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.424809 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.443643 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.463402 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.484054 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.498082 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-ca\") pod 
\"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.498152 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d862f1c7-a1fa-483d-8fd7-9b7e7ef51568-srv-cert\") pod \"catalog-operator-68c6474976-lpqtr\" (UID: \"d862f1c7-a1fa-483d-8fd7-9b7e7ef51568\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.498195 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-client\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.498330 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-service-ca\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.498418 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-config\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.498520 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bcd665d-3ce2-437e-abcd-43175c1395c8-serving-cert\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.498613 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3d21106f-1d62-4e2f-98ac-5411f66d8352-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nnbzm\" (UID: \"3d21106f-1d62-4e2f-98ac-5411f66d8352\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.501968 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-service-ca\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.502227 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-config\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.502236 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-ca\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.504103 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.505922 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5bcd665d-3ce2-437e-abcd-43175c1395c8-etcd-client\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.506278 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3d21106f-1d62-4e2f-98ac-5411f66d8352-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nnbzm\" (UID: \"3d21106f-1d62-4e2f-98ac-5411f66d8352\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.506279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bcd665d-3ce2-437e-abcd-43175c1395c8-serving-cert\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.506410 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d862f1c7-a1fa-483d-8fd7-9b7e7ef51568-srv-cert\") pod \"catalog-operator-68c6474976-lpqtr\" (UID: \"d862f1c7-a1fa-483d-8fd7-9b7e7ef51568\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.523160 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.544158 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.598076 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv2rp\" (UniqueName: \"kubernetes.io/projected/10a607f0-e1ef-405d-9771-54076793d426-kube-api-access-hv2rp\") pod \"controller-manager-879f6c89f-4sh2x\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.623565 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckx4f\" (UniqueName: \"kubernetes.io/projected/41c5c97b-d54b-4770-bc40-af3149d25304-kube-api-access-ckx4f\") pod \"machine-approver-56656f9798-vf689\" (UID: \"41c5c97b-d54b-4770-bc40-af3149d25304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.625234 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.648330 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbrlh\" (UniqueName: \"kubernetes.io/projected/21e078d9-a539-4626-b30f-908b8e866a7a-kube-api-access-gbrlh\") pod \"oauth-openshift-558db77b4-ln57b\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.653760 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:39 crc kubenswrapper[4778]: W1205 15:57:39.653990 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41c5c97b_d54b_4770_bc40_af3149d25304.slice/crio-ebb3ed50f45fd1a2a2176880e5875e29d81501fe0c617082b8a991ed2c1b4444 WatchSource:0}: Error finding container ebb3ed50f45fd1a2a2176880e5875e29d81501fe0c617082b8a991ed2c1b4444: Status 404 returned error can't find the container with id ebb3ed50f45fd1a2a2176880e5875e29d81501fe0c617082b8a991ed2c1b4444 Dec 05 15:57:39 crc kubenswrapper[4778]: W1205 15:57:39.658180 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-46c40553d4c32a21afb8a18d15e9f63697cd4bd0726bfac6a886450c57add1f8 WatchSource:0}: Error finding container 46c40553d4c32a21afb8a18d15e9f63697cd4bd0726bfac6a886450c57add1f8: Status 404 returned error can't find the container with id 46c40553d4c32a21afb8a18d15e9f63697cd4bd0726bfac6a886450c57add1f8 Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.660115 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwxv5\" (UniqueName: \"kubernetes.io/projected/01fba0d7-2372-4976-8ccc-4a4b15ff2fb5-kube-api-access-qwxv5\") pod \"apiserver-7bbb656c7d-cp4xg\" (UID: \"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.683528 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw8l6\" (UniqueName: \"kubernetes.io/projected/50ea62f2-f338-4164-8ecc-3d7d777c0d43-kube-api-access-gw8l6\") pod \"authentication-operator-69f744f599-kxmn5\" (UID: \"50ea62f2-f338-4164-8ecc-3d7d777c0d43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.700619 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rbqx\" (UniqueName: \"kubernetes.io/projected/782cf494-3079-47fe-8f6c-f7d5731a5b69-kube-api-access-5rbqx\") pod \"route-controller-manager-6576b87f9c-6jmzh\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.701920 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.716608 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxhk9\" (UniqueName: \"kubernetes.io/projected/ea8ad42e-dfd1-486c-85c1-d4ff1bb95707-kube-api-access-lxhk9\") pod \"downloads-7954f5f757-smxt5\" (UID: \"ea8ad42e-dfd1-486c-85c1-d4ff1bb95707\") " pod="openshift-console/downloads-7954f5f757-smxt5" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.725512 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.750917 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-smxt5" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.763650 4778 request.go:700] Waited for 1.892200142s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0 Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.765642 4778 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.769578 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtvz4\" (UniqueName: \"kubernetes.io/projected/6b16e394-6692-4df4-ad2c-5163e126b448-kube-api-access-dtvz4\") pod \"cluster-samples-operator-665b6dd947-56shj\" (UID: \"6b16e394-6692-4df4-ad2c-5163e126b448\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.784277 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.826233 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2lc\" (UniqueName: \"kubernetes.io/projected/26f36385-887c-4459-9c97-b1fb8f8d1d26-kube-api-access-vr2lc\") pod \"openshift-apiserver-operator-796bbdcf4f-psmvp\" (UID: \"26f36385-887c-4459-9c97-b1fb8f8d1d26\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.839568 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.843798 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2c52\" (UniqueName: \"kubernetes.io/projected/cfbfb374-e786-4e49-8c80-54ec12c7abcc-kube-api-access-c2c52\") pod \"apiserver-76f77b778f-ffqnf\" (UID: \"cfbfb374-e786-4e49-8c80-54ec12c7abcc\") " pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.853852 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ln57b"] Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.857583 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65kc6\" (UniqueName: \"kubernetes.io/projected/f859bdf8-f651-407a-a6b8-6c3ae2fe7f63-kube-api-access-65kc6\") pod \"machine-api-operator-5694c8668f-27smd\" (UID: \"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.867483 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.884934 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfl8h\" (UniqueName: \"kubernetes.io/projected/b8d6b1ed-75bd-4c5a-ab2d-294b7359301d-kube-api-access-mfl8h\") pod \"openshift-config-operator-7777fb866f-5bl2q\" (UID: \"b8d6b1ed-75bd-4c5a-ab2d-294b7359301d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.903258 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg8g8\" (UniqueName: \"kubernetes.io/projected/11591472-458c-43dc-b51b-2b15987291a0-kube-api-access-kg8g8\") pod \"dns-operator-744455d44c-mw82h\" (UID: \"11591472-458c-43dc-b51b-2b15987291a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-mw82h" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.918248 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.918735 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.922085 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhh9b\" (UniqueName: \"kubernetes.io/projected/0c134aff-5bc5-4901-8746-5f79fb395b01-kube-api-access-bhh9b\") pod \"console-f9d7485db-gnnls\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.931621 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.938333 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.944447 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.957784 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21c410dc-98d6-4319-9bae-e4025e9fdbb5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-znzwx\" (UID: \"21c410dc-98d6-4319-9bae-e4025e9fdbb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.961853 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5snd\" (UniqueName: \"kubernetes.io/projected/9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61-kube-api-access-m5snd\") pod \"openshift-controller-manager-operator-756b6f6bc6-ffn5l\" (UID: \"9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.978433 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvjct\" (UniqueName: \"kubernetes.io/projected/66a3882a-e9bc-40d4-b51f-e47d9354f53a-kube-api-access-nvjct\") pod \"marketplace-operator-79b997595-ptlw8\" (UID: \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.982066 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kxmn5"] Dec 05 15:57:39 crc kubenswrapper[4778]: I1205 15:57:39.995746 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-smxt5"] Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:39.999951 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ff9de8-25f7-408a-834b-ec4edf3c98e5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t72kw\" (UID: \"e9ff9de8-25f7-408a-834b-ec4edf3c98e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.019952 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7zpq\" (UniqueName: \"kubernetes.io/projected/5bcd665d-3ce2-437e-abcd-43175c1395c8-kube-api-access-b7zpq\") pod \"etcd-operator-b45778765-f26tf\" (UID: \"5bcd665d-3ce2-437e-abcd-43175c1395c8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.026908 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.042278 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm8s4\" (UniqueName: \"kubernetes.io/projected/168fbd70-f065-4a1d-965a-c1d67493a528-kube-api-access-wm8s4\") pod \"router-default-5444994796-zzpst\" (UID: \"168fbd70-f065-4a1d-965a-c1d67493a528\") " pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.045997 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.054160 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.058400 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p8rw\" (UniqueName: \"kubernetes.io/projected/3d21106f-1d62-4e2f-98ac-5411f66d8352-kube-api-access-2p8rw\") pod \"multus-admission-controller-857f4d67dd-nnbzm\" (UID: \"3d21106f-1d62-4e2f-98ac-5411f66d8352\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.062685 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.062813 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mw82h" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.069262 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh"] Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.079027 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r6r8\" (UID: \"d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.093802 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.101357 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpwtv\" (UniqueName: \"kubernetes.io/projected/2f84a125-f112-4f6a-ae37-63cf387032c7-kube-api-access-qpwtv\") pod \"cluster-image-registry-operator-dc59b4c8b-fv48p\" (UID: \"2f84a125-f112-4f6a-ae37-63cf387032c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.120663 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt8t6\" (UniqueName: \"kubernetes.io/projected/e9ff9de8-25f7-408a-834b-ec4edf3c98e5-kube-api-access-dt8t6\") pod \"ingress-operator-5b745b69d9-t72kw\" (UID: \"e9ff9de8-25f7-408a-834b-ec4edf3c98e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.144844 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"68a259e73dbd09aac550b737c8551cef124451f67b8eeba16a9606acdfed0355"} Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.144894 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"46c40553d4c32a21afb8a18d15e9f63697cd4bd0726bfac6a886450c57add1f8"} Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.150853 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" event={"ID":"21e078d9-a539-4626-b30f-908b8e866a7a","Type":"ContainerStarted","Data":"ba940eec518780bc4b7c9d0da0af3f0530c8b3211a25d1df1d64bb2c539478db"} Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.150912 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zg62\" (UniqueName: \"kubernetes.io/projected/cca81879-8d18-4463-8db2-b17f8c24874c-kube-api-access-4zg62\") pod \"machine-config-operator-74547568cd-rbbgx\" (UID: \"cca81879-8d18-4463-8db2-b17f8c24874c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.150954 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4sh2x"] Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.152621 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.152972 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b02435e3f62fe6f1e0ecfb1cfb9e10b9777075df8d4e18b0471df6111ca040a6"} Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.152997 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a16c8ef4704c3c9f82907e2fcb4e3e045cf91974df64865e781157656c52ddf5"} Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.153311 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.153738 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.155215 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" event={"ID":"50ea62f2-f338-4164-8ecc-3d7d777c0d43","Type":"ContainerStarted","Data":"70ca7400e587787e5e2813ac5857d0f6f611607021c8ab3f8ce15600a4deecf3"} Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.158858 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78nbs\" (UniqueName: \"kubernetes.io/projected/9093abb1-e6d3-48d8-972a-88c8f8ec9fe2-kube-api-access-78nbs\") pod \"package-server-manager-789f6589d5-4mr8q\" (UID: \"9093abb1-e6d3-48d8-972a-88c8f8ec9fe2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.165303 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1a4eacc60bff868a26277bed9107fd68977cd5cf7e19a53a04927ce72bc36faa"} Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.165397 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ce62df28adc1510b77a975b9fe0941b6617dbf27e32300d5277d0b4a936d8bc5"} Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.172342 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" event={"ID":"41c5c97b-d54b-4770-bc40-af3149d25304","Type":"ContainerStarted","Data":"01cb42fdc9d15062732c6835d2715c948f4d5bfd82480020c0c218f007131ff6"} Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.172392 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" event={"ID":"41c5c97b-d54b-4770-bc40-af3149d25304","Type":"ContainerStarted","Data":"ebb3ed50f45fd1a2a2176880e5875e29d81501fe0c617082b8a991ed2c1b4444"} Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.173479 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-smxt5" event={"ID":"ea8ad42e-dfd1-486c-85c1-d4ff1bb95707","Type":"ContainerStarted","Data":"b939e9c98a394cbeaed581d9f60fc30e489c281ac33ba8c2f6de24bb9dd56bfd"} Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.179411 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.181476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g5b9\" (UniqueName: \"kubernetes.io/projected/d862f1c7-a1fa-483d-8fd7-9b7e7ef51568-kube-api-access-4g5b9\") pod \"catalog-operator-68c6474976-lpqtr\" (UID: \"d862f1c7-a1fa-483d-8fd7-9b7e7ef51568\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.197928 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghshr\" (UniqueName: \"kubernetes.io/projected/6a0bc7a3-053e-495b-ac9f-21322dace59d-kube-api-access-ghshr\") pod \"olm-operator-6b444d44fb-8xvnf\" (UID: \"6a0bc7a3-053e-495b-ac9f-21322dace59d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" Dec 05 15:57:40 crc kubenswrapper[4778]: W1205 15:57:40.224484 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10a607f0_e1ef_405d_9771_54076793d426.slice/crio-165e85a7da479e2ff22022cad2f2b3a35405f6d6fe21fb4b6b33d22e15700323 WatchSource:0}: Error finding container 165e85a7da479e2ff22022cad2f2b3a35405f6d6fe21fb4b6b33d22e15700323: Status 404 returned error can't find the container with id 165e85a7da479e2ff22022cad2f2b3a35405f6d6fe21fb4b6b33d22e15700323 Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.224896 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frhpq\" (UniqueName: \"kubernetes.io/projected/b2b86f96-71d0-4398-8963-5ad320ad5f2f-kube-api-access-frhpq\") pod \"kube-storage-version-migrator-operator-b67b599dd-vz6vk\" (UID: \"b2b86f96-71d0-4398-8963-5ad320ad5f2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.245704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f84a125-f112-4f6a-ae37-63cf387032c7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fv48p\" (UID: \"2f84a125-f112-4f6a-ae37-63cf387032c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.261177 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h274\" (UniqueName: \"kubernetes.io/projected/cac1e607-7735-4696-9666-34cb5ecb4857-kube-api-access-9h274\") pod \"control-plane-machine-set-operator-78cbb6b69f-8gjtp\" (UID: \"cac1e607-7735-4696-9666-34cb5ecb4857\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.316009 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.317344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg6jc\" (UniqueName: \"kubernetes.io/projected/a91f7a18-161e-4089-bc09-835d5b33b65f-kube-api-access-jg6jc\") pod \"console-operator-58897d9998-jfv5q\" (UID: \"a91f7a18-161e-4089-bc09-835d5b33b65f\") " pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.317855 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.322981 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.340386 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.340507 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhkfr\" (UniqueName: \"kubernetes.io/projected/a0b2c74a-c566-4614-aa26-65f67ae6fc94-kube-api-access-nhkfr\") pod \"machine-config-controller-84d6567774-w929v\" (UID: \"a0b2c74a-c566-4614-aa26-65f67ae6fc94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.342144 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05baa2fe-0ea7-41b3-9e70-04412e0e5658-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nfqps\" (UID: \"05baa2fe-0ea7-41b3-9e70-04412e0e5658\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.346720 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4mc5\" (UniqueName: \"kubernetes.io/projected/e1235317-93be-4e63-b980-259191d32b82-kube-api-access-d4mc5\") pod \"migrator-59844c95c7-ndj72\" (UID: \"e1235317-93be-4e63-b980-259191d32b82\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ndj72" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.351556 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg"] Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.367487 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.390929 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.399745 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.403779 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.413746 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ndj72" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418452 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/26dc74da-e40d-4974-be9b-4f25f1eb66e7-apiservice-cert\") pod \"packageserver-d55dfcdfc-zhb7w\" (UID: \"26dc74da-e40d-4974-be9b-4f25f1eb66e7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418497 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00bbab66-fa02-4505-8afb-d9d9c1370d95-secret-volume\") pod \"collect-profiles-29415825-9mpnd\" (UID: \"00bbab66-fa02-4505-8afb-d9d9c1370d95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418520 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-registry-tls\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418546 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz2lx\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-kube-api-access-dz2lx\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418564 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e0a0bdff-cebb-4207-a8d2-9b53eb9a1886-node-bootstrap-token\") pod \"machine-config-server-sb2zn\" (UID: \"e0a0bdff-cebb-4207-a8d2-9b53eb9a1886\") " pod="openshift-machine-config-operator/machine-config-server-sb2zn" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418580 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgq2b\" (UniqueName: \"kubernetes.io/projected/00bbab66-fa02-4505-8afb-d9d9c1370d95-kube-api-access-qgq2b\") pod \"collect-profiles-29415825-9mpnd\" (UID: \"00bbab66-fa02-4505-8afb-d9d9c1370d95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418596 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e0a0bdff-cebb-4207-a8d2-9b53eb9a1886-certs\") pod \"machine-config-server-sb2zn\" (UID: \"e0a0bdff-cebb-4207-a8d2-9b53eb9a1886\") " pod="openshift-machine-config-operator/machine-config-server-sb2zn" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418611 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xmj9\" (UniqueName: 
\"kubernetes.io/projected/a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d-kube-api-access-5xmj9\") pod \"service-ca-operator-777779d784-z5vlr\" (UID: \"a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418647 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbnk8\" (UniqueName: \"kubernetes.io/projected/26dc74da-e40d-4974-be9b-4f25f1eb66e7-kube-api-access-mbnk8\") pod \"packageserver-d55dfcdfc-zhb7w\" (UID: \"26dc74da-e40d-4974-be9b-4f25f1eb66e7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418662 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d0c5c29-5367-41e6-be46-e23a9ac5e281-registry-certificates\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418687 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00bbab66-fa02-4505-8afb-d9d9c1370d95-config-volume\") pod \"collect-profiles-29415825-9mpnd\" (UID: \"00bbab66-fa02-4505-8afb-d9d9c1370d95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418711 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d-config\") pod \"service-ca-operator-777779d784-z5vlr\" (UID: \"a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418725 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjr6c\" (UniqueName: \"kubernetes.io/projected/f9012bc6-8bf5-4d46-bd75-6c7f0b68c758-kube-api-access-cjr6c\") pod \"ingress-canary-qdnqc\" (UID: \"f9012bc6-8bf5-4d46-bd75-6c7f0b68c758\") " pod="openshift-ingress-canary/ingress-canary-qdnqc" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418802 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/615fe1ea-d314-4611-bc3f-198d641d4fb5-signing-cabundle\") pod \"service-ca-9c57cc56f-hshsw\" (UID: \"615fe1ea-d314-4611-bc3f-198d641d4fb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418817 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/26dc74da-e40d-4974-be9b-4f25f1eb66e7-webhook-cert\") pod \"packageserver-d55dfcdfc-zhb7w\" (UID: \"26dc74da-e40d-4974-be9b-4f25f1eb66e7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418859 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/615fe1ea-d314-4611-bc3f-198d641d4fb5-signing-key\") 
pod \"service-ca-9c57cc56f-hshsw\" (UID: \"615fe1ea-d314-4611-bc3f-198d641d4fb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418925 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d0c5c29-5367-41e6-be46-e23a9ac5e281-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418973 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9012bc6-8bf5-4d46-bd75-6c7f0b68c758-cert\") pod \"ingress-canary-qdnqc\" (UID: \"f9012bc6-8bf5-4d46-bd75-6c7f0b68c758\") " pod="openshift-ingress-canary/ingress-canary-qdnqc" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.418987 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmzzv\" (UniqueName: \"kubernetes.io/projected/e0a0bdff-cebb-4207-a8d2-9b53eb9a1886-kube-api-access-mmzzv\") pod \"machine-config-server-sb2zn\" (UID: \"e0a0bdff-cebb-4207-a8d2-9b53eb9a1886\") " pod="openshift-machine-config-operator/machine-config-server-sb2zn" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.419013 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db6e2e26-e34f-46e8-a5fe-b25a12930d39-config-volume\") pod \"dns-default-nf9v5\" (UID: \"db6e2e26-e34f-46e8-a5fe-b25a12930d39\") " pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.419034 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.419059 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-bound-sa-token\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.419073 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd2tc\" (UniqueName: \"kubernetes.io/projected/615fe1ea-d314-4611-bc3f-198d641d4fb5-kube-api-access-sd2tc\") pod \"service-ca-9c57cc56f-hshsw\" (UID: \"615fe1ea-d314-4611-bc3f-198d641d4fb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.419086 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db6e2e26-e34f-46e8-a5fe-b25a12930d39-metrics-tls\") pod \"dns-default-nf9v5\" (UID: \"db6e2e26-e34f-46e8-a5fe-b25a12930d39\") " pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.419121 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d0c5c29-5367-41e6-be46-e23a9ac5e281-trusted-ca\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.419155 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d-serving-cert\") pod \"service-ca-operator-777779d784-z5vlr\" (UID: \"a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.419222 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfpwr\" (UniqueName: \"kubernetes.io/projected/db6e2e26-e34f-46e8-a5fe-b25a12930d39-kube-api-access-hfpwr\") pod \"dns-default-nf9v5\" (UID: \"db6e2e26-e34f-46e8-a5fe-b25a12930d39\") " pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.419248 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d0c5c29-5367-41e6-be46-e23a9ac5e281-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.419288 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/26dc74da-e40d-4974-be9b-4f25f1eb66e7-tmpfs\") pod \"packageserver-d55dfcdfc-zhb7w\" (UID: \"26dc74da-e40d-4974-be9b-4f25f1eb66e7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: E1205 15:57:40.422594 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:40.922580634 +0000 UTC m=+148.026377014 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.427702 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.429634 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.471287 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:40 crc kubenswrapper[4778]: W1205 15:57:40.471756 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01fba0d7_2372_4976_8ccc_4a4b15ff2fb5.slice/crio-5dec94c3d4dfbd1fc644a7bc022a4a15f161dc846a1a2db547aedb74ea4a44f1 WatchSource:0}: Error finding container 5dec94c3d4dfbd1fc644a7bc022a4a15f161dc846a1a2db547aedb74ea4a44f1: Status 404 returned error can't find the container with id 5dec94c3d4dfbd1fc644a7bc022a4a15f161dc846a1a2db547aedb74ea4a44f1 Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520046 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520401 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/615fe1ea-d314-4611-bc3f-198d641d4fb5-signing-key\") pod \"service-ca-9c57cc56f-hshsw\" (UID: \"615fe1ea-d314-4611-bc3f-198d641d4fb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520462 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-socket-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520510 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d0c5c29-5367-41e6-be46-e23a9ac5e281-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520597 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9012bc6-8bf5-4d46-bd75-6c7f0b68c758-cert\") pod \"ingress-canary-qdnqc\" (UID: \"f9012bc6-8bf5-4d46-bd75-6c7f0b68c758\") " pod="openshift-ingress-canary/ingress-canary-qdnqc" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520656 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmzzv\" (UniqueName: \"kubernetes.io/projected/e0a0bdff-cebb-4207-a8d2-9b53eb9a1886-kube-api-access-mmzzv\") pod \"machine-config-server-sb2zn\" (UID: \"e0a0bdff-cebb-4207-a8d2-9b53eb9a1886\") " pod="openshift-machine-config-operator/machine-config-server-sb2zn" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520736 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db6e2e26-e34f-46e8-a5fe-b25a12930d39-config-volume\") pod \"dns-default-nf9v5\" (UID: \"db6e2e26-e34f-46e8-a5fe-b25a12930d39\") " pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520786 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-plugins-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520813 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-bound-sa-token\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520837 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd2tc\" (UniqueName: \"kubernetes.io/projected/615fe1ea-d314-4611-bc3f-198d641d4fb5-kube-api-access-sd2tc\") pod \"service-ca-9c57cc56f-hshsw\" (UID: \"615fe1ea-d314-4611-bc3f-198d641d4fb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520858 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db6e2e26-e34f-46e8-a5fe-b25a12930d39-metrics-tls\") pod \"dns-default-nf9v5\" (UID: \"db6e2e26-e34f-46e8-a5fe-b25a12930d39\") " pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520930 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d0c5c29-5367-41e6-be46-e23a9ac5e281-trusted-ca\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.520978 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d-serving-cert\") pod \"service-ca-operator-777779d784-z5vlr\" (UID: \"a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-mountpoint-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521105 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-csi-data-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521154 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfpwr\" (UniqueName: \"kubernetes.io/projected/db6e2e26-e34f-46e8-a5fe-b25a12930d39-kube-api-access-hfpwr\") pod \"dns-default-nf9v5\" (UID: \"db6e2e26-e34f-46e8-a5fe-b25a12930d39\") " pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521216 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d0c5c29-5367-41e6-be46-e23a9ac5e281-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521278 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/26dc74da-e40d-4974-be9b-4f25f1eb66e7-tmpfs\") pod \"packageserver-d55dfcdfc-zhb7w\" (UID: \"26dc74da-e40d-4974-be9b-4f25f1eb66e7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521343 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/26dc74da-e40d-4974-be9b-4f25f1eb66e7-apiservice-cert\") pod \"packageserver-d55dfcdfc-zhb7w\" (UID: \"26dc74da-e40d-4974-be9b-4f25f1eb66e7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521410 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00bbab66-fa02-4505-8afb-d9d9c1370d95-secret-volume\") pod \"collect-profiles-29415825-9mpnd\" (UID: \"00bbab66-fa02-4505-8afb-d9d9c1370d95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521462 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-registry-tls\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521486 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz2lx\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-kube-api-access-dz2lx\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521521 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e0a0bdff-cebb-4207-a8d2-9b53eb9a1886-node-bootstrap-token\") pod \"machine-config-server-sb2zn\" (UID: \"e0a0bdff-cebb-4207-a8d2-9b53eb9a1886\") " pod="openshift-machine-config-operator/machine-config-server-sb2zn" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521545 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgq2b\" (UniqueName: \"kubernetes.io/projected/00bbab66-fa02-4505-8afb-d9d9c1370d95-kube-api-access-qgq2b\") pod \"collect-profiles-29415825-9mpnd\" (UID: \"00bbab66-fa02-4505-8afb-d9d9c1370d95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521592 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e0a0bdff-cebb-4207-a8d2-9b53eb9a1886-certs\") pod \"machine-config-server-sb2zn\" 
(UID: \"e0a0bdff-cebb-4207-a8d2-9b53eb9a1886\") " pod="openshift-machine-config-operator/machine-config-server-sb2zn" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521615 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmhvl\" (UniqueName: \"kubernetes.io/projected/4fdf5c65-8d99-48be-9033-c8df3a8afdea-kube-api-access-cmhvl\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521640 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xmj9\" (UniqueName: \"kubernetes.io/projected/a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d-kube-api-access-5xmj9\") pod \"service-ca-operator-777779d784-z5vlr\" (UID: \"a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521717 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-registration-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521798 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbnk8\" (UniqueName: \"kubernetes.io/projected/26dc74da-e40d-4974-be9b-4f25f1eb66e7-kube-api-access-mbnk8\") pod \"packageserver-d55dfcdfc-zhb7w\" (UID: \"26dc74da-e40d-4974-be9b-4f25f1eb66e7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521830 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d0c5c29-5367-41e6-be46-e23a9ac5e281-registry-certificates\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521857 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00bbab66-fa02-4505-8afb-d9d9c1370d95-config-volume\") pod \"collect-profiles-29415825-9mpnd\" (UID: \"00bbab66-fa02-4505-8afb-d9d9c1370d95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521894 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d-config\") pod \"service-ca-operator-777779d784-z5vlr\" (UID: \"a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.521916 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjr6c\" (UniqueName: \"kubernetes.io/projected/f9012bc6-8bf5-4d46-bd75-6c7f0b68c758-kube-api-access-cjr6c\") pod \"ingress-canary-qdnqc\" (UID: \"f9012bc6-8bf5-4d46-bd75-6c7f0b68c758\") " pod="openshift-ingress-canary/ingress-canary-qdnqc" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.522022 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/615fe1ea-d314-4611-bc3f-198d641d4fb5-signing-cabundle\") pod \"service-ca-9c57cc56f-hshsw\" (UID: \"615fe1ea-d314-4611-bc3f-198d641d4fb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.522055 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/26dc74da-e40d-4974-be9b-4f25f1eb66e7-webhook-cert\") pod \"packageserver-d55dfcdfc-zhb7w\" (UID: \"26dc74da-e40d-4974-be9b-4f25f1eb66e7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: E1205 15:57:40.522887 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:41.022866426 +0000 UTC m=+148.126662806 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.528191 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/26dc74da-e40d-4974-be9b-4f25f1eb66e7-tmpfs\") pod \"packageserver-d55dfcdfc-zhb7w\" (UID: \"26dc74da-e40d-4974-be9b-4f25f1eb66e7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.529171 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00bbab66-fa02-4505-8afb-d9d9c1370d95-config-volume\") pod \"collect-profiles-29415825-9mpnd\" (UID: \"00bbab66-fa02-4505-8afb-d9d9c1370d95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.530356 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d0c5c29-5367-41e6-be46-e23a9ac5e281-trusted-ca\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.530872 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d0c5c29-5367-41e6-be46-e23a9ac5e281-registry-certificates\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.532474 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d-config\") pod \"service-ca-operator-777779d784-z5vlr\" (UID: \"a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.534503 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e0a0bdff-cebb-4207-a8d2-9b53eb9a1886-certs\") pod \"machine-config-server-sb2zn\" (UID: \"e0a0bdff-cebb-4207-a8d2-9b53eb9a1886\") " pod="openshift-machine-config-operator/machine-config-server-sb2zn" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.535331 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d0c5c29-5367-41e6-be46-e23a9ac5e281-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.535559 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db6e2e26-e34f-46e8-a5fe-b25a12930d39-config-volume\") pod \"dns-default-nf9v5\" (UID: \"db6e2e26-e34f-46e8-a5fe-b25a12930d39\") " pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.537309 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db6e2e26-e34f-46e8-a5fe-b25a12930d39-metrics-tls\") pod \"dns-default-nf9v5\" (UID: \"db6e2e26-e34f-46e8-a5fe-b25a12930d39\") " pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.538383 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/615fe1ea-d314-4611-bc3f-198d641d4fb5-signing-cabundle\") pod \"service-ca-9c57cc56f-hshsw\" (UID: \"615fe1ea-d314-4611-bc3f-198d641d4fb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.547844 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d0c5c29-5367-41e6-be46-e23a9ac5e281-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.548289 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-registry-tls\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.549771 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e0a0bdff-cebb-4207-a8d2-9b53eb9a1886-node-bootstrap-token\") pod \"machine-config-server-sb2zn\" (UID: \"e0a0bdff-cebb-4207-a8d2-9b53eb9a1886\") " pod="openshift-machine-config-operator/machine-config-server-sb2zn" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.550665 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d-serving-cert\") pod \"service-ca-operator-777779d784-z5vlr\" (UID: \"a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.552962 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/615fe1ea-d314-4611-bc3f-198d641d4fb5-signing-key\") pod \"service-ca-9c57cc56f-hshsw\" (UID: \"615fe1ea-d314-4611-bc3f-198d641d4fb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.555978 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/26dc74da-e40d-4974-be9b-4f25f1eb66e7-apiservice-cert\") pod \"packageserver-d55dfcdfc-zhb7w\" (UID: \"26dc74da-e40d-4974-be9b-4f25f1eb66e7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.562134 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/26dc74da-e40d-4974-be9b-4f25f1eb66e7-webhook-cert\") pod \"packageserver-d55dfcdfc-zhb7w\" (UID: \"26dc74da-e40d-4974-be9b-4f25f1eb66e7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.562135 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00bbab66-fa02-4505-8afb-d9d9c1370d95-secret-volume\") pod \"collect-profiles-29415825-9mpnd\" (UID: \"00bbab66-fa02-4505-8afb-d9d9c1370d95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.562585 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9012bc6-8bf5-4d46-bd75-6c7f0b68c758-cert\") pod \"ingress-canary-qdnqc\" (UID: \"f9012bc6-8bf5-4d46-bd75-6c7f0b68c758\") " pod="openshift-ingress-canary/ingress-canary-qdnqc" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.586199 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz2lx\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-kube-api-access-dz2lx\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.587233 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffqnf"] Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.587845 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-gnnls"] Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.590744 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgq2b\" (UniqueName: \"kubernetes.io/projected/00bbab66-fa02-4505-8afb-d9d9c1370d95-kube-api-access-qgq2b\") pod \"collect-profiles-29415825-9mpnd\" (UID: \"00bbab66-fa02-4505-8afb-d9d9c1370d95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.592977 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj"] Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.609005 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mbnk8\" (UniqueName: \"kubernetes.io/projected/26dc74da-e40d-4974-be9b-4f25f1eb66e7-kube-api-access-mbnk8\") pod \"packageserver-d55dfcdfc-zhb7w\" (UID: \"26dc74da-e40d-4974-be9b-4f25f1eb66e7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.623309 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-socket-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.623355 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.623393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-plugins-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.623432 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-mountpoint-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.623448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-csi-data-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.623482 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmhvl\" (UniqueName: \"kubernetes.io/projected/4fdf5c65-8d99-48be-9033-c8df3a8afdea-kube-api-access-cmhvl\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.623509 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-registration-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.623697 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-registration-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.623752 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-socket-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: E1205 15:57:40.623993 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:41.12398261 +0000 UTC m=+148.227778980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.624310 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-plugins-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.624305 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-csi-data-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.624352 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4fdf5c65-8d99-48be-9033-c8df3a8afdea-mountpoint-dir\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.626739 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjr6c\" (UniqueName: \"kubernetes.io/projected/f9012bc6-8bf5-4d46-bd75-6c7f0b68c758-kube-api-access-cjr6c\") pod \"ingress-canary-qdnqc\" (UID: \"f9012bc6-8bf5-4d46-bd75-6c7f0b68c758\") " pod="openshift-ingress-canary/ingress-canary-qdnqc" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.633179 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.648093 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xmj9\" (UniqueName: \"kubernetes.io/projected/a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d-kube-api-access-5xmj9\") pod \"service-ca-operator-777779d784-z5vlr\" (UID: \"a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.664295 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfpwr\" (UniqueName: \"kubernetes.io/projected/db6e2e26-e34f-46e8-a5fe-b25a12930d39-kube-api-access-hfpwr\") pod \"dns-default-nf9v5\" (UID: \"db6e2e26-e34f-46e8-a5fe-b25a12930d39\") " pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:40 crc kubenswrapper[4778]: W1205 15:57:40.675651 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c134aff_5bc5_4901_8746_5f79fb395b01.slice/crio-c0e8a92fd0d7c151ef97ecd39512369d823cc097223fcabd5447111ed66e61d2 WatchSource:0}: Error finding container c0e8a92fd0d7c151ef97ecd39512369d823cc097223fcabd5447111ed66e61d2: Status 404 returned error can't find the container with id c0e8a92fd0d7c151ef97ecd39512369d823cc097223fcabd5447111ed66e61d2 Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.691965 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmzzv\" (UniqueName: \"kubernetes.io/projected/e0a0bdff-cebb-4207-a8d2-9b53eb9a1886-kube-api-access-mmzzv\") pod \"machine-config-server-sb2zn\" (UID: \"e0a0bdff-cebb-4207-a8d2-9b53eb9a1886\") " pod="openshift-machine-config-operator/machine-config-server-sb2zn" Dec 05 15:57:40 crc kubenswrapper[4778]: W1205 15:57:40.704001 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfbfb374_e786_4e49_8c80_54ec12c7abcc.slice/crio-1734d9138fa6b3959f7b04d12885df2bad5c8d1cc7362f9da820f2555d99a1c0 WatchSource:0}: Error finding container 1734d9138fa6b3959f7b04d12885df2bad5c8d1cc7362f9da820f2555d99a1c0: Status 404 returned error can't find the container with id 1734d9138fa6b3959f7b04d12885df2bad5c8d1cc7362f9da820f2555d99a1c0 Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.711277 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-bound-sa-token\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.722747 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd2tc\" (UniqueName: \"kubernetes.io/projected/615fe1ea-d314-4611-bc3f-198d641d4fb5-kube-api-access-sd2tc\") pod \"service-ca-9c57cc56f-hshsw\" (UID: \"615fe1ea-d314-4611-bc3f-198d641d4fb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.724053 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:40 crc kubenswrapper[4778]: E1205 15:57:40.724266 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:41.224248711 +0000 UTC m=+148.328045091 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.724477 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: E1205 15:57:40.724794 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:41.224778525 +0000 UTC m=+148.328574905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.736751 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.748068 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.769343 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.771451 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmhvl\" (UniqueName: \"kubernetes.io/projected/4fdf5c65-8d99-48be-9033-c8df3a8afdea-kube-api-access-cmhvl\") pod \"csi-hostpathplugin-bctw8\" (UID: \"4fdf5c65-8d99-48be-9033-c8df3a8afdea\") " pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.790781 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.797276 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qdnqc" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.798971 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx"] Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.803726 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp"] Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.806015 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.815681 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sb2zn" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.830347 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:40 crc kubenswrapper[4778]: E1205 15:57:40.830783 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:41.330765712 +0000 UTC m=+148.434562092 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.835783 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q"] Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.836312 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bctw8" Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.938220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:40 crc kubenswrapper[4778]: E1205 15:57:40.939645 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:41.439630548 +0000 UTC m=+148.543426928 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.949020 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-27smd"]
Dec 05 15:57:40 crc kubenswrapper[4778]: I1205 15:57:40.966184 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l"]
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.040859 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:41 crc kubenswrapper[4778]: E1205 15:57:41.041232 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:41.541216365 +0000 UTC m=+148.645012745 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.142756 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:41 crc kubenswrapper[4778]: E1205 15:57:41.143213 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:41.643196793 +0000 UTC m=+148.746993183 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.186753 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" event={"ID":"50ea62f2-f338-4164-8ecc-3d7d777c0d43","Type":"ContainerStarted","Data":"41441b878cf19898db00c9102c68303cb44a88740d3b8e085e0b304c7c55ae88"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.191094 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zzpst" event={"ID":"168fbd70-f065-4a1d-965a-c1d67493a528","Type":"ContainerStarted","Data":"afa6d0b5eabeff5fe40f58cf633e29e1e214ebc5ac4c8bc03d6a5b1220fd6b18"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.191140 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zzpst" event={"ID":"168fbd70-f065-4a1d-965a-c1d67493a528","Type":"ContainerStarted","Data":"0225344adeb353da6e846509760cc893005882c746e2943584ed9b85e38cca3b"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.194205 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" event={"ID":"782cf494-3079-47fe-8f6c-f7d5731a5b69","Type":"ContainerStarted","Data":"c1782157bd3a3558c3268f513bbc8b6601d6c8ebed287c582554e9b40d194492"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.194251 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" event={"ID":"782cf494-3079-47fe-8f6c-f7d5731a5b69","Type":"ContainerStarted","Data":"7d89690147654bcae8e19db3ec7c3d41285c91f13fc30035523798dc3826440f"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.194496 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh"
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.196052 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj" event={"ID":"6b16e394-6692-4df4-ad2c-5163e126b448","Type":"ContainerStarted","Data":"2fd97aea6510fee182e395fcbadab88413ecd400be9541e7dea5418fcd15b5b9"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.198441 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" event={"ID":"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63","Type":"ContainerStarted","Data":"331e81f0b9554a804a556439d18e63ab3511d28b7615630ddff870912c9cf7ed"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.201297 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-smxt5" event={"ID":"ea8ad42e-dfd1-486c-85c1-d4ff1bb95707","Type":"ContainerStarted","Data":"b250b0aa28a197dec205bcf5daa81ca0188af85c095bc3ddc979b17649d773e7"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.201633 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-smxt5"
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.203045 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-smxt5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body=
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.203510 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-smxt5" podUID="ea8ad42e-dfd1-486c-85c1-d4ff1bb95707" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused"
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.207458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" event={"ID":"b8d6b1ed-75bd-4c5a-ab2d-294b7359301d","Type":"ContainerStarted","Data":"83d49077c657321be22b6eb2d76d2351309445d90a28cf6416399d323a185cf7"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.219820 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" event={"ID":"10a607f0-e1ef-405d-9771-54076793d426","Type":"ContainerStarted","Data":"8f812803cfe9ef7eef20d31d0db2188c48d3e35f9969ba0edf7062582b4508d6"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.219866 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" event={"ID":"10a607f0-e1ef-405d-9771-54076793d426","Type":"ContainerStarted","Data":"165e85a7da479e2ff22022cad2f2b3a35405f6d6fe21fb4b6b33d22e15700323"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.220423 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x"
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.224112 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" event={"ID":"41c5c97b-d54b-4770-bc40-af3149d25304","Type":"ContainerStarted","Data":"4fccc7906f817f674467d9a78d74e3491bf84e856f83a9e7950ceaef3ffea6e9"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.232476 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sb2zn" event={"ID":"e0a0bdff-cebb-4207-a8d2-9b53eb9a1886","Type":"ContainerStarted","Data":"0468cddd30989b86cdd059660bb08298a6a5794b73db24e5906280ae75190c04"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.235706 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx" event={"ID":"21c410dc-98d6-4319-9bae-e4025e9fdbb5","Type":"ContainerStarted","Data":"1f3a7e4de34baca86624e2bd409346f48c02cc4f64c0a95c4c2cf8f946b8dcdf"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.237810 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp" event={"ID":"26f36385-887c-4459-9c97-b1fb8f8d1d26","Type":"ContainerStarted","Data":"eccb17210c8ef0f84a561634b6f8df4a3f8fb053a90439dfff2c75560cb36a0e"}
Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.241145 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" event={"ID":"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5","Type":"ContainerStarted","Data":"5dec94c3d4dfbd1fc644a7bc022a4a15f161dc846a1a2db547aedb74ea4a44f1"}
event={"ID":"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5","Type":"ContainerStarted","Data":"5dec94c3d4dfbd1fc644a7bc022a4a15f161dc846a1a2db547aedb74ea4a44f1"} Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.242729 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l" event={"ID":"9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61","Type":"ContainerStarted","Data":"8a9b6ce9315cda0fd1c11ccfd2bab2a4ce6f06eba165216c54b34529d38a3956"} Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.243563 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:41 crc kubenswrapper[4778]: E1205 15:57:41.244005 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:41.743986368 +0000 UTC m=+148.847782748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.345275 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.348858 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gnnls" event={"ID":"0c134aff-5bc5-4901-8746-5f79fb395b01","Type":"ContainerStarted","Data":"c0e8a92fd0d7c151ef97ecd39512369d823cc097223fcabd5447111ed66e61d2"} Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.348930 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.348941 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" event={"ID":"cfbfb374-e786-4e49-8c80-54ec12c7abcc","Type":"ContainerStarted","Data":"1734d9138fa6b3959f7b04d12885df2bad5c8d1cc7362f9da820f2555d99a1c0"} Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.348953 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.348963 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" 
event={"ID":"21e078d9-a539-4626-b30f-908b8e866a7a","Type":"ContainerStarted","Data":"ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf"} Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.348987 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:57:41 crc kubenswrapper[4778]: E1205 15:57:41.349408 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:41.849391909 +0000 UTC m=+148.953188289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.448457 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:41 crc kubenswrapper[4778]: E1205 15:57:41.448702 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:41.948680083 +0000 UTC m=+149.052476473 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.448958 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:41 crc kubenswrapper[4778]: E1205 15:57:41.449341 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:41.949330661 +0000 UTC m=+149.053127041 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.509768 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.524527 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ptlw8"] Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.551057 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:41 crc kubenswrapper[4778]: E1205 15:57:41.551691 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:42.051659578 +0000 UTC m=+149.155455958 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.594800 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-f26tf"] Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.597398 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mw82h"] Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.614859 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nnbzm"] Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.658140 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:41 crc kubenswrapper[4778]: E1205 15:57:41.658901 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:42.158886209 +0000 UTC m=+149.262682589 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.732170 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-kxmn5" podStartSLOduration=129.732147242 podStartE2EDuration="2m9.732147242s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:41.730830845 +0000 UTC m=+148.834627225" watchObservedRunningTime="2025-12-05 15:57:41.732147242 +0000 UTC m=+148.835943622" Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.761990 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:41 crc kubenswrapper[4778]: E1205 15:57:41.762377 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:42.262343287 +0000 UTC m=+149.366139667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.864296 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p"] Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.867254 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:41 crc kubenswrapper[4778]: E1205 15:57:41.867635 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:42.367620434 +0000 UTC m=+149.471416814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.877997 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk"] Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.920660 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jfv5q"] Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.948842 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw"] Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.965803 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8"] Dec 05 15:57:41 crc kubenswrapper[4778]: I1205 15:57:41.968163 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:41 crc kubenswrapper[4778]: E1205 15:57:41.968453 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:42.4684349 +0000 UTC m=+149.572231280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.020152 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr"] Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.049422 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.057713 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q"] Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.057867 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:42 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:42 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:42 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.057907 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.081276 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:42 crc kubenswrapper[4778]: E1205 15:57:42.081635 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:42.581624065 +0000 UTC m=+149.685420445 (durationBeforeRetry 500ms). 
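[Annotation] The router's startup-probe output above is the aggregated healthz format: each named check prints a [+]/[-] line, any single failure turns the endpoint into an HTTP 500, and the body ends with "healthz check failed". An illustrative sketch of that aggregation pattern under assumed check names taken from the log output (not the router's actual code):

```go
package main

import (
	"fmt"
	"net/http"
)

type check struct {
	name string
	fn   func() error
}

// healthz renders one [+]/[-] line per check; any failure yields a 500,
// which is exactly what the kubelet's startup probe reports above.
func healthz(checks []check) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		body := ""
		failed := false
		for _, c := range checks {
			if err := c.fn(); err != nil {
				failed = true
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
			} else {
				body += fmt.Sprintf("[+]%s ok\n", c.name)
			}
		}
		if failed {
			w.WriteHeader(http.StatusInternalServerError) // probe sees "statuscode: 500"
			body += "healthz check failed\n"
		}
		fmt.Fprint(w, body)
	}
}

func main() {
	// Hypothetical checks mirroring the probe body in the log: the router is
	// running but has not yet synced routes or opened its HTTP backends.
	checks := []check{
		{"backend-http", func() error { return fmt.Errorf("not ready") }},
		{"has-synced", func() error { return fmt.Errorf("not ready") }},
		{"process-running", func() error { return nil }},
	}
	http.HandleFunc("/healthz", healthz(checks))
	http.ListenAndServe(":8080", nil)
}
```

"reason withheld" in the real output means the detailed error is only exposed on the verbose endpoint; the probe body intentionally hides it.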
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.085017 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" podStartSLOduration=130.085002147 podStartE2EDuration="2m10.085002147s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:42.081761288 +0000 UTC m=+149.185557668" watchObservedRunningTime="2025-12-05 15:57:42.085002147 +0000 UTC m=+149.188798527"
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.113905 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w929v"]
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.142825 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps"]
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.179862 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf"]
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.184150 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:42 crc kubenswrapper[4778]: E1205 15:57:42.184445 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:42.684417014 +0000 UTC m=+149.788213394 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.194122 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr"]
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.195892 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp"]
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.238704 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ndj72"]
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.251718 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-zzpst" podStartSLOduration=130.251700123 podStartE2EDuration="2m10.251700123s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:42.25158464 +0000 UTC m=+149.355381020" watchObservedRunningTime="2025-12-05 15:57:42.251700123 +0000 UTC m=+149.355496503"
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.257688 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx"]
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.285851 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:42 crc kubenswrapper[4778]: E1205 15:57:42.286184 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:42.786173216 +0000 UTC m=+149.889969596 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.325444 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" podStartSLOduration=130.325426099 podStartE2EDuration="2m10.325426099s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:42.32511636 +0000 UTC m=+149.428912740" watchObservedRunningTime="2025-12-05 15:57:42.325426099 +0000 UTC m=+149.429222479"
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.336029 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" event={"ID":"d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7","Type":"ContainerStarted","Data":"97108b2346cf0ba88c347ec6301bf865bed1b38931ebfd6a1daafdff51980d5b"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.337969 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" event={"ID":"5bcd665d-3ce2-437e-abcd-43175c1395c8","Type":"ContainerStarted","Data":"1a74971455eec96bce1a8cc76e1865d42161a4369a22997860a410e9d4154cce"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.346914 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sb2zn" event={"ID":"e0a0bdff-cebb-4207-a8d2-9b53eb9a1886","Type":"ContainerStarted","Data":"da325b08ea42154e63bbd8ddd3c3a645ad70c051c57db88d921cada411a842b6"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.357265 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hshsw"]
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.373014 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w"]
Dec 05 15:57:42 crc kubenswrapper[4778]: W1205 15:57:42.373333 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1235317_93be_4e63_b980_259191d32b82.slice/crio-2faac12cc643c84b39b6c91759363b80f4a0d1a215ddde244cc74559461f72e0 WatchSource:0}: Error finding container 2faac12cc643c84b39b6c91759363b80f4a0d1a215ddde244cc74559461f72e0: Status 404 returned error can't find the container with id 2faac12cc643c84b39b6c91759363b80f4a0d1a215ddde244cc74559461f72e0
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.374035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" event={"ID":"b2b86f96-71d0-4398-8963-5ad320ad5f2f","Type":"ContainerStarted","Data":"c7074ce0dbfc055fa63ad44edd9488fbad65f74d5bb4b305e42607b2a59043e9"}
Dec 05 15:57:42 crc kubenswrapper[4778]: W1205 15:57:42.384692 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcca81879_8d18_4463_8db2_b17f8c24874c.slice/crio-193605c193a25726ad8b046ddbcdec93ba224fc20851f47fa0d7680076061f38 WatchSource:0}: Error finding container 193605c193a25726ad8b046ddbcdec93ba224fc20851f47fa0d7680076061f38: Status 404 returned error can't find the container with id 193605c193a25726ad8b046ddbcdec93ba224fc20851f47fa0d7680076061f38
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.391402 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.391448 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" event={"ID":"a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d","Type":"ContainerStarted","Data":"bd0d6214bb6a19b3dd5043d825f465f5b559b21c094c7d4eabe759fc0c1159b0"}
Dec 05 15:57:42 crc kubenswrapper[4778]: E1205 15:57:42.391680 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:42.891643118 +0000 UTC m=+149.995439498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.391961 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:42 crc kubenswrapper[4778]: E1205 15:57:42.392267 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:42.892260195 +0000 UTC m=+149.996056575 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.452744 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jfv5q" event={"ID":"a91f7a18-161e-4089-bc09-835d5b33b65f","Type":"ContainerStarted","Data":"255e1b5ffd76708c71c71ccc89f013252a86ece5ebaad81319e2dc7ccda87c74"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.462070 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" event={"ID":"2f84a125-f112-4f6a-ae37-63cf387032c7","Type":"ContainerStarted","Data":"64da1eda4ece8c3f5e499fcc09746373116c20b76b4e29bd194c32946bf72afc"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.471311 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" podStartSLOduration=129.471296406 podStartE2EDuration="2m9.471296406s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:42.438789457 +0000 UTC m=+149.542585837" watchObservedRunningTime="2025-12-05 15:57:42.471296406 +0000 UTC m=+149.575092776"
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.481332 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bctw8"]
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.485664 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" event={"ID":"9093abb1-e6d3-48d8-972a-88c8f8ec9fe2","Type":"ContainerStarted","Data":"0e34782bdf26ffeeaa44c4f7e9d650682078867420cd89ccd53f029629e4c92a"}
Dec 05 15:57:42 crc kubenswrapper[4778]: W1205 15:57:42.485766 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26dc74da_e40d_4974_be9b_4f25f1eb66e7.slice/crio-9abaf68b8d54cb8c874631c2ddb0ddbe4d86d0e5eeea964a5ca29fbe87932668 WatchSource:0}: Error finding container 9abaf68b8d54cb8c874631c2ddb0ddbe4d86d0e5eeea964a5ca29fbe87932668: Status 404 returned error can't find the container with id 9abaf68b8d54cb8c874631c2ddb0ddbe4d86d0e5eeea964a5ca29fbe87932668
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.492743 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:42 crc kubenswrapper[4778]: E1205 15:57:42.493066 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:42.99305146 +0000 UTC m=+150.096847830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.502148 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-smxt5" podStartSLOduration=130.502130459 podStartE2EDuration="2m10.502130459s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:42.501929764 +0000 UTC m=+149.605726134" watchObservedRunningTime="2025-12-05 15:57:42.502130459 +0000 UTC m=+149.605926829"
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.504712 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp" event={"ID":"cac1e607-7735-4696-9666-34cb5ecb4857","Type":"ContainerStarted","Data":"f6d8861b5c37f4b72d6f4224fc5f7963bc4ee26eb88a81fa2f0b0f6a3e809974"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.516851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj" event={"ID":"6b16e394-6692-4df4-ad2c-5163e126b448","Type":"ContainerStarted","Data":"cbc8553d7d76233927353e905f6dea9b52a3e8b43e034af28a2bbc4450d9794c"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.516901 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj" event={"ID":"6b16e394-6692-4df4-ad2c-5163e126b448","Type":"ContainerStarted","Data":"f4f86f09a7fc3646f114bfd4fe3e04c25a8763085cba5b3316cbb0ed57394cc7"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.518962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" event={"ID":"05baa2fe-0ea7-41b3-9e70-04412e0e5658","Type":"ContainerStarted","Data":"cdef4cb5034546dc497f4d17221b1f199dba5cbc6efe5e07d89add08b9bdf2b9"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.519785 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" event={"ID":"6a0bc7a3-053e-495b-ac9f-21322dace59d","Type":"ContainerStarted","Data":"99e61dc2a55e664e21d5379b90bcfcc65282e6b728c2c2837514468a32f5513e"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.522432 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" event={"ID":"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63","Type":"ContainerStarted","Data":"e4f9897452c5615036bc56ef5bf17d3271c12966177bcabc28db448455916dea"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.539373 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nf9v5"]
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.543847 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qdnqc"]
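[Annotation] The pod_startup_latency_tracker records above are internally consistent: with zero-valued firstStartedPulling/lastFinishedPulling (no image pull observed, so no pull time is excluded), podStartSLOduration equals watchObservedRunningTime minus podCreationTimestamp, and matches podStartE2EDuration. A small sanity-check of that arithmetic, using values copied from the downloads-7954f5f757-smxt5 record:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout covers the log's timestamp form, with an optional fractional second.
	layout := "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-12-05 15:55:32 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-12-05 15:57:42.502130459 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 2m10.502130459s, i.e. the logged podStartSLOduration=130.502130459
	// and podStartE2EDuration="2m10.502130459s".
	fmt.Println(observed.Sub(created))
}
```

The ~130 s figures across these pods are dominated by the roughly two minutes between pod creation and this kubelet restart, not by slow container starts after boot.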
pods=["openshift-ingress-canary/ingress-canary-qdnqc"] Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.555183 4778 generic.go:334] "Generic (PLEG): container finished" podID="cfbfb374-e786-4e49-8c80-54ec12c7abcc" containerID="00c289be0811e189180317fdc1bacfaafae97bfea4652559395a0a26538fa64d" exitCode=0 Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.555738 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" event={"ID":"cfbfb374-e786-4e49-8c80-54ec12c7abcc","Type":"ContainerDied","Data":"00c289be0811e189180317fdc1bacfaafae97bfea4652559395a0a26538fa64d"} Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.578092 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vf689" podStartSLOduration=130.578070045 podStartE2EDuration="2m10.578070045s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:42.559776394 +0000 UTC m=+149.663572774" watchObservedRunningTime="2025-12-05 15:57:42.578070045 +0000 UTC m=+149.681866435" Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.582108 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd"] Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.595428 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:42 crc kubenswrapper[4778]: E1205 15:57:42.595879 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:43.095861261 +0000 UTC m=+150.199657641 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.641871 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" event={"ID":"a0b2c74a-c566-4614-aa26-65f67ae6fc94","Type":"ContainerStarted","Data":"d83bb261b4159e06dfbe52610c5570f9b602a8ef710d3fc289d7713ddfc04734"} Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.690631 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" event={"ID":"d862f1c7-a1fa-483d-8fd7-9b7e7ef51568","Type":"ContainerStarted","Data":"0ea61b571c91027e4fcc9810a0acc403af97f393b03a660a473d508d00732953"} Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.696768 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:42 crc kubenswrapper[4778]: E1205 15:57:42.697225 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:43.197204491 +0000 UTC m=+150.301000921 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.710165 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" podStartSLOduration=129.710148865 podStartE2EDuration="2m9.710148865s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:42.677661847 +0000 UTC m=+149.781458227" watchObservedRunningTime="2025-12-05 15:57:42.710148865 +0000 UTC m=+149.813945245" Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.711421 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-sb2zn" podStartSLOduration=5.711417749 podStartE2EDuration="5.711417749s" podCreationTimestamp="2025-12-05 15:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:42.709861227 +0000 UTC m=+149.813657607" watchObservedRunningTime="2025-12-05 15:57:42.711417749 +0000 UTC m=+149.815214129" Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.713850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mw82h" event={"ID":"11591472-458c-43dc-b51b-2b15987291a0","Type":"ContainerStarted","Data":"7257be3cb96c8f388328fbb5ad55123f07788c2fc0aac63958c94cf6763d1ed9"} Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.742841 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" event={"ID":"3d21106f-1d62-4e2f-98ac-5411f66d8352","Type":"ContainerStarted","Data":"9798a368855e3444b65bd98cbbf1e2264d6df029e8715f4a5f78e84fd0f86f9c"} Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.763471 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l" event={"ID":"9cc17ecc-bbb7-465a-9f6f-7e7742c2ed61","Type":"ContainerStarted","Data":"769b2d7742481888d29f796053ae380ed279f27dbbd38eabba1039107ebfaea9"} Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.786801 4778 generic.go:334] "Generic (PLEG): container finished" podID="01fba0d7-2372-4976-8ccc-4a4b15ff2fb5" containerID="105178f057feee8bf0db1dc5cd86841e664180629fb93dfd1147df7c47b692c9" exitCode=0 Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.786902 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" event={"ID":"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5","Type":"ContainerDied","Data":"105178f057feee8bf0db1dc5cd86841e664180629fb93dfd1147df7c47b692c9"} Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.796559 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56shj" podStartSLOduration=130.796546047 podStartE2EDuration="2m10.796546047s" 
podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:42.790643806 +0000 UTC m=+149.894440176" watchObservedRunningTime="2025-12-05 15:57:42.796546047 +0000 UTC m=+149.900342427" Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.798995 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:42 crc kubenswrapper[4778]: E1205 15:57:42.799291 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:43.299279592 +0000 UTC m=+150.403075972 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.815808 4778 generic.go:334] "Generic (PLEG): container finished" podID="b8d6b1ed-75bd-4c5a-ab2d-294b7359301d" containerID="932f4f6b602326a27c9918ae093015f48d66275446f10c23e60b3adecad27499" exitCode=0 Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.815900 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" event={"ID":"b8d6b1ed-75bd-4c5a-ab2d-294b7359301d","Type":"ContainerDied","Data":"932f4f6b602326a27c9918ae093015f48d66275446f10c23e60b3adecad27499"} Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.900274 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" event={"ID":"e9ff9de8-25f7-408a-834b-ec4edf3c98e5","Type":"ContainerStarted","Data":"9b6214953cd67aef7fa74bcb164a44dd802fabc3762a23a1bb7ccf3c28224f63"} Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.900800 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:42 crc kubenswrapper[4778]: E1205 15:57:42.901330 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:43.40131621 +0000 UTC m=+150.505112590 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.939896 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx" event={"ID":"21c410dc-98d6-4319-9bae-e4025e9fdbb5","Type":"ContainerStarted","Data":"fe37cd82d9e4fd0caac36bba2b1036e02fd418aad7c73558dfdd1e70c0c1379c"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.944749 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ffn5l" podStartSLOduration=130.944726787 podStartE2EDuration="2m10.944726787s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:42.944161331 +0000 UTC m=+150.047957711" watchObservedRunningTime="2025-12-05 15:57:42.944726787 +0000 UTC m=+150.048523187"
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.953770 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gnnls" event={"ID":"0c134aff-5bc5-4901-8746-5f79fb395b01","Type":"ContainerStarted","Data":"d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.988599 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp" event={"ID":"26f36385-887c-4459-9c97-b1fb8f8d1d26","Type":"ContainerStarted","Data":"9cb5f9c971222f8aad6de63b959e38833b810fce83ef31beb2d2fe36c9a4fbe1"}
Dec 05 15:57:42 crc kubenswrapper[4778]: I1205 15:57:42.991768 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-znzwx" podStartSLOduration=130.991756832 podStartE2EDuration="2m10.991756832s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:42.989560153 +0000 UTC m=+150.093356533" watchObservedRunningTime="2025-12-05 15:57:42.991756832 +0000 UTC m=+150.095553212"
Dec 05 15:57:43 crc kubenswrapper[4778]: E1205 15:57:43.004913 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:43.504902162 +0000 UTC m=+150.608698542 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.004940 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.015433 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" event={"ID":"66a3882a-e9bc-40d4-b51f-e47d9354f53a","Type":"ContainerStarted","Data":"e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9"}
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.015478 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" event={"ID":"66a3882a-e9bc-40d4-b51f-e47d9354f53a","Type":"ContainerStarted","Data":"85be95007696f2605993739d0fbe4ffbfc6892c36cc9a4e0af09cbafe93fe2a6"}
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.015491 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8"
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.016735 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-smxt5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body=
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.016781 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-smxt5" podUID="ea8ad42e-dfd1-486c-85c1-d4ff1bb95707" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused"
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.045054 4778 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ptlw8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.045111 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" podUID="66a3882a-e9bc-40d4-b51f-e47d9354f53a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused"
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.047783 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-psmvp" podStartSLOduration=131.047771633 podStartE2EDuration="2m11.047771633s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:43.045641006 +0000 UTC m=+150.149437406" watchObservedRunningTime="2025-12-05 15:57:43.047771633 +0000 UTC m=+150.151568013"
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.077773 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 15:57:43 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Dec 05 15:57:43 crc kubenswrapper[4778]: [+]process-running ok
Dec 05 15:57:43 crc kubenswrapper[4778]: healthz check failed
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.077837 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.106395 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:43 crc kubenswrapper[4778]: E1205 15:57:43.106688 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:43.606666703 +0000 UTC m=+150.710463083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.107299 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:43 crc kubenswrapper[4778]: E1205 15:57:43.118215 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:43.618193749 +0000 UTC m=+150.721990129 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.138503 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-gnnls" podStartSLOduration=131.138485604 podStartE2EDuration="2m11.138485604s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:43.078156904 +0000 UTC m=+150.181953294" watchObservedRunningTime="2025-12-05 15:57:43.138485604 +0000 UTC m=+150.242281984"
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.138938 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" podStartSLOduration=130.138930556 podStartE2EDuration="2m10.138930556s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:43.138228307 +0000 UTC m=+150.242024687" watchObservedRunningTime="2025-12-05 15:57:43.138930556 +0000 UTC m=+150.242726956"
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.211581 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:43 crc kubenswrapper[4778]: E1205 15:57:43.211848 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:43.711834308 +0000 UTC m=+150.815630688 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.313681 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:43 crc kubenswrapper[4778]: E1205 15:57:43.314900 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:43.814539606 +0000 UTC m=+150.918335986 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.419558 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:43 crc kubenswrapper[4778]: E1205 15:57:43.419691 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:43.919667599 +0000 UTC m=+151.023463979 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.420402 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:43 crc kubenswrapper[4778]: E1205 15:57:43.420750 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:43.920740629 +0000 UTC m=+151.024537009 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.529756 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:43 crc kubenswrapper[4778]: E1205 15:57:43.530070 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:44.030054817 +0000 UTC m=+151.133851187 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.632039 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:43 crc kubenswrapper[4778]: E1205 15:57:43.633654 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:44.133642879 +0000 UTC m=+151.237439259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.735770 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:43 crc kubenswrapper[4778]: E1205 15:57:43.736143 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:44.23612922 +0000 UTC m=+151.339925590 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.845029 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:43 crc kubenswrapper[4778]: E1205 15:57:43.845651 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:44.345629253 +0000 UTC m=+151.449425633 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:43 crc kubenswrapper[4778]: I1205 15:57:43.945886 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:43 crc kubenswrapper[4778]: E1205 15:57:43.946166 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:44.446152391 +0000 UTC m=+151.549948771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.050747 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:44 crc kubenswrapper[4778]: E1205 15:57:44.051256 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:44.551239574 +0000 UTC m=+151.655035954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.064669 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 15:57:44 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Dec 05 15:57:44 crc kubenswrapper[4778]: [+]process-running ok
Dec 05 15:57:44 crc kubenswrapper[4778]: healthz check failed
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.064757 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.087810 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-27smd" event={"ID":"f859bdf8-f651-407a-a6b8-6c3ae2fe7f63","Type":"ContainerStarted","Data":"61a54b86a62a6be7b2b8c6bd118b874860736e779f80f7b0bde86104461f384e"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.097058 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" event={"ID":"2f84a125-f112-4f6a-ae37-63cf387032c7","Type":"ContainerStarted","Data":"8ef3e605638537825cd9c4d90611cafe7c71c52ae2e5781953f88709ab18086c"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.103619 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" event={"ID":"615fe1ea-d314-4611-bc3f-198d641d4fb5","Type":"ContainerStarted","Data":"8256665f57ce1688b4368ff719637a4fbde479dc437e297396931ce6db671024"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.103663 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" event={"ID":"615fe1ea-d314-4611-bc3f-198d641d4fb5","Type":"ContainerStarted","Data":"b0bc35d925beb449180660f0a9f557e366ddd69a2607bed4fe30b9d1930492cb"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.128130 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" event={"ID":"b2b86f96-71d0-4398-8963-5ad320ad5f2f","Type":"ContainerStarted","Data":"a23f77966d341f5e61633d9e78ec57a668f086be36615eff272be820fac125c1"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.131139 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" event={"ID":"a5b96dbe-bfe1-40dc-b4f1-bb2e21acdc1d","Type":"ContainerStarted","Data":"6cd63992d38f5bf6d9767f9adf3a7a3d8ee33db7008c4b1103edbd6c653e6947"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.151974 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:44 crc kubenswrapper[4778]: E1205 15:57:44.152329 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:44.652315067 +0000 UTC m=+151.756111447 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.173561 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp" event={"ID":"cac1e607-7735-4696-9666-34cb5ecb4857","Type":"ContainerStarted","Data":"7a965dd044b0d58a081685b20621af7452ebad151b51e779ccbe6c5ba317bfe1"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.209849 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" event={"ID":"d0ca20d2-43d7-4f59-bb16-1f80d1fdb0c7","Type":"ContainerStarted","Data":"d345af97418b4c7b454aceda3971a0d0c584e7dacc5fea731207a4eb44784afb"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.242693 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" event={"ID":"00bbab66-fa02-4505-8afb-d9d9c1370d95","Type":"ContainerStarted","Data":"5ed5f18a843b48f50604e98b811e271036ea4e3626b486832ce9e95c37f718de"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.242752 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" event={"ID":"00bbab66-fa02-4505-8afb-d9d9c1370d95","Type":"ContainerStarted","Data":"b78999547d192ae133f8e845ec7f6ecba7e387b7a7967d268451d46446fc683d"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.261748 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:44 crc kubenswrapper[4778]: E1205 15:57:44.263259 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:44.763243608 +0000 UTC m=+151.867039988 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.277525 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" event={"ID":"26dc74da-e40d-4974-be9b-4f25f1eb66e7","Type":"ContainerStarted","Data":"9abaf68b8d54cb8c874631c2ddb0ddbe4d86d0e5eeea964a5ca29fbe87932668"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.278425 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.288852 4778 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zhb7w container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body=
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.288911 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" podUID="26dc74da-e40d-4974-be9b-4f25f1eb66e7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.313844 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z5vlr" podStartSLOduration=131.313827051 podStartE2EDuration="2m11.313827051s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.297048412 +0000 UTC m=+151.400844792" watchObservedRunningTime="2025-12-05 15:57:44.313827051 +0000 UTC m=+151.417623431"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.315178 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-hshsw" podStartSLOduration=131.315173298 podStartE2EDuration="2m11.315173298s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.243857428 +0000 UTC m=+151.347653808" watchObservedRunningTime="2025-12-05 15:57:44.315173298 +0000 UTC m=+151.418969678"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.321196 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" event={"ID":"3d21106f-1d62-4e2f-98ac-5411f66d8352","Type":"ContainerStarted","Data":"178f97642ddea2fda3675ea641a5923c6118be2181b948e0949deffaf8d42b44"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.350075 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nf9v5"
event={"ID":"db6e2e26-e34f-46e8-a5fe-b25a12930d39","Type":"ContainerStarted","Data":"4efdf0a47e126289b036c3892d8d4076fab6b497dbd5ed07a696d0ed90641d4a"} Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.368976 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:44 crc kubenswrapper[4778]: E1205 15:57:44.371157 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:44.871142408 +0000 UTC m=+151.974938788 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.371304 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fv48p" podStartSLOduration=132.371288132 podStartE2EDuration="2m12.371288132s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.349727952 +0000 UTC m=+151.453524342" watchObservedRunningTime="2025-12-05 15:57:44.371288132 +0000 UTC m=+151.475084512" Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.384894 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" event={"ID":"a0b2c74a-c566-4614-aa26-65f67ae6fc94","Type":"ContainerStarted","Data":"4646391c5538edd8fde801e9b7ddf0c176e0f70ba24b41e8828bcf4a6061d33c"} Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.430515 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" event={"ID":"e9ff9de8-25f7-408a-834b-ec4edf3c98e5","Type":"ContainerStarted","Data":"e755c9d55655da3b36b67a1e3ab7376b70651a07ceaa2dc0f778ddf06aa1a984"} Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.458544 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gjtp" podStartSLOduration=132.458525876 podStartE2EDuration="2m12.458525876s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.457291933 +0000 UTC m=+151.561088313" watchObservedRunningTime="2025-12-05 15:57:44.458525876 +0000 UTC m=+151.562322256" Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.459163 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vz6vk" 
podStartSLOduration=132.459158024 podStartE2EDuration="2m12.459158024s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.424870117 +0000 UTC m=+151.528666497" watchObservedRunningTime="2025-12-05 15:57:44.459158024 +0000 UTC m=+151.562954404" Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.475205 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:44 crc kubenswrapper[4778]: E1205 15:57:44.475531 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:44.975519611 +0000 UTC m=+152.079315991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.498248 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" event={"ID":"d862f1c7-a1fa-483d-8fd7-9b7e7ef51568","Type":"ContainerStarted","Data":"2962c5126b0e182cc5c3a11f6fd2402c9da5193ae17433f127c1381b640a49c8"} Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.498723 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.533756 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jfv5q" event={"ID":"a91f7a18-161e-4089-bc09-835d5b33b65f","Type":"ContainerStarted","Data":"436fa571a530bc32c298f644bab42e666d9ef0e319b11ee9fb4318517d36c68b"} Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.534738 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-jfv5q" Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.555084 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ndj72" event={"ID":"e1235317-93be-4e63-b980-259191d32b82","Type":"ContainerStarted","Data":"0b3d6e837b07dcc8fe767e27c93dc8152d82fb331aefb741d36c5f70c3195f02"} Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.555155 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ndj72" event={"ID":"e1235317-93be-4e63-b980-259191d32b82","Type":"ContainerStarted","Data":"2faac12cc643c84b39b6c91759363b80f4a0d1a215ddde244cc74559461f72e0"} Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.557710 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.566256 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bctw8" event={"ID":"4fdf5c65-8d99-48be-9033-c8df3a8afdea","Type":"ContainerStarted","Data":"74423e8b33cc2d8b50d6a5721858e0c6fb5db299b5530037b12b655c4a12b4c2"} Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.576299 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r6r8" podStartSLOduration=132.576284795 podStartE2EDuration="2m12.576284795s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.527206284 +0000 UTC m=+151.631002664" watchObservedRunningTime="2025-12-05 15:57:44.576284795 +0000 UTC m=+151.680081175" Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.578403 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:44 crc kubenswrapper[4778]: E1205 15:57:44.579528 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.079513893 +0000 UTC m=+152.183310273 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.605294 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qdnqc" event={"ID":"f9012bc6-8bf5-4d46-bd75-6c7f0b68c758","Type":"ContainerStarted","Data":"55cabe7dd55a7a39ef8bf1eea0eac3778139936d4e9061828753ce420a0f521b"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.605348 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qdnqc" event={"ID":"f9012bc6-8bf5-4d46-bd75-6c7f0b68c758","Type":"ContainerStarted","Data":"7fa1aab8f987cec0825bbd18b8504c837bd413561a2647cf072818df6e70d47e"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.626969 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mw82h" event={"ID":"11591472-458c-43dc-b51b-2b15987291a0","Type":"ContainerStarted","Data":"07bf93ddc7c01e3166616e76fd20daff4f93aa39fff57db51677c9d493ec1b5f"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.652444 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" podStartSLOduration=132.652427397 podStartE2EDuration="2m12.652427397s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.651757549 +0000 UTC m=+151.755553929" watchObservedRunningTime="2025-12-05 15:57:44.652427397 +0000 UTC m=+151.756223777"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.653919 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" podStartSLOduration=131.653911348 podStartE2EDuration="2m11.653911348s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.578174787 +0000 UTC m=+151.681971167" watchObservedRunningTime="2025-12-05 15:57:44.653911348 +0000 UTC m=+151.757707728"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.683116 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:44 crc kubenswrapper[4778]: E1205 15:57:44.683413 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.183401713 +0000 UTC m=+152.287198093 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.685728 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" event={"ID":"9093abb1-e6d3-48d8-972a-88c8f8ec9fe2","Type":"ContainerStarted","Data":"1ad13eadc4c415efdfa268bc8ae707633879190a4dd51586f5ee22c084aaa8b1"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.686518 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.717605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" event={"ID":"cca81879-8d18-4463-8db2-b17f8c24874c","Type":"ContainerStarted","Data":"ef819a3086d317191b35b96327713d81bff044b981d83b284d40f8a2544fe487"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.717667 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" event={"ID":"cca81879-8d18-4463-8db2-b17f8c24874c","Type":"ContainerStarted","Data":"193605c193a25726ad8b046ddbcdec93ba224fc20851f47fa0d7680076061f38"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.736745 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" event={"ID":"6a0bc7a3-053e-495b-ac9f-21322dace59d","Type":"ContainerStarted","Data":"acf5e4335382e9a9b3a1dc0e1a0c47c6dbe9e43628e3663db10c8be531c8fddd"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.737563 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.749436 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" podStartSLOduration=132.749420328 podStartE2EDuration="2m12.749420328s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.714158924 +0000 UTC m=+151.817955304" watchObservedRunningTime="2025-12-05 15:57:44.749420328 +0000 UTC m=+151.853216708"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.750551 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-jfv5q" podStartSLOduration=132.750547709 podStartE2EDuration="2m12.750547709s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.748725239 +0000 UTC m=+151.852521619" watchObservedRunningTime="2025-12-05 15:57:44.750547709 +0000 UTC m=+151.854344089"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.751416 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.780061 4778 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8xvnf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.780124 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" podUID="6a0bc7a3-053e-495b-ac9f-21322dace59d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.783848 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:44 crc kubenswrapper[4778]: E1205 15:57:44.784045 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.284007153 +0000 UTC m=+152.387803543 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:44 crc kubenswrapper[4778]: E1205 15:57:44.787029 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.287009765 +0000 UTC m=+152.390806135 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.784354 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.803564 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" event={"ID":"5bcd665d-3ce2-437e-abcd-43175c1395c8","Type":"ContainerStarted","Data":"783349edfe06413bce56e3f717018667eec9d4a8c626a21c395ade4c6de73736"}
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.808640 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qdnqc" podStartSLOduration=7.808614756 podStartE2EDuration="7.808614756s" podCreationTimestamp="2025-12-05 15:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.788891167 +0000 UTC m=+151.892687547" watchObservedRunningTime="2025-12-05 15:57:44.808614756 +0000 UTC m=+151.912411136"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.822728 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.858120 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mw82h" podStartSLOduration=132.858100849 podStartE2EDuration="2m12.858100849s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.826227837 +0000 UTC m=+151.930024207" watchObservedRunningTime="2025-12-05 15:57:44.858100849 +0000 UTC m=+151.961897219"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.875570 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ndj72" podStartSLOduration=132.875551836 podStartE2EDuration="2m12.875551836s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.874866297 +0000 UTC m=+151.978662677" watchObservedRunningTime="2025-12-05 15:57:44.875551836 +0000 UTC m=+151.979348216"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.876279 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lpqtr" podStartSLOduration=131.876274195 podStartE2EDuration="2m11.876274195s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.858322395 +0000 UTC m=+151.962118775" watchObservedRunningTime="2025-12-05 15:57:44.876274195 +0000 UTC m=+151.980070575"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.891497 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:44 crc kubenswrapper[4778]: E1205 15:57:44.892770 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.392748886 +0000 UTC m=+152.496545266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.949870 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" podStartSLOduration=131.949840426 podStartE2EDuration="2m11.949840426s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.907242512 +0000 UTC m=+152.011038912" watchObservedRunningTime="2025-12-05 15:57:44.949840426 +0000 UTC m=+152.053636806"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.980603 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" podStartSLOduration=131.980588897 podStartE2EDuration="2m11.980588897s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.953485926 +0000 UTC m=+152.057282306" watchObservedRunningTime="2025-12-05 15:57:44.980588897 +0000 UTC m=+152.084385277"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.980916 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" podStartSLOduration=132.980912595 podStartE2EDuration="2m12.980912595s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:44.980021401 +0000 UTC m=+152.083817771" watchObservedRunningTime="2025-12-05 15:57:44.980912595 +0000 UTC m=+152.084708975"
Dec 05 15:57:44 crc kubenswrapper[4778]: I1205 15:57:44.995525 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.004097 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.504081269 +0000 UTC m=+152.607877649 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.054795 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 15:57:45 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Dec 05 15:57:45 crc kubenswrapper[4778]: [+]process-running ok
Dec 05 15:57:45 crc kubenswrapper[4778]: healthz check failed
Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.055256 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.059489 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" podStartSLOduration=132.059478194 podStartE2EDuration="2m12.059478194s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:45.058686792 +0000 UTC m=+152.162483172" watchObservedRunningTime="2025-12-05 15:57:45.059478194 +0000 UTC m=+152.163274574"
Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.060263 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-f26tf" podStartSLOduration=133.060256795 podStartE2EDuration="2m13.060256795s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:45.03339633 +0000 UTC m=+152.137192710" watchObservedRunningTime="2025-12-05 15:57:45.060256795 +0000 UTC m=+152.164053175"
Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.097818 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.098099 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.598073668 +0000 UTC m=+152.701870048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.098702 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452"
Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.099010 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.598999564 +0000 UTC m=+152.702795944 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.132633 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-jfv5q"
Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.200939 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.201049 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.701033923 +0000 UTC m=+152.804830303 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.201216 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.201515 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.701507345 +0000 UTC m=+152.805303725 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.211322 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wkk25"] Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.212203 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.218736 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.233635 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wkk25"] Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.303901 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.304129 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.80407665 +0000 UTC m=+152.907873030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.304291 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cff4721-de89-49fa-9f19-682ec8ae8e64-catalog-content\") pod \"community-operators-wkk25\" (UID: \"8cff4721-de89-49fa-9f19-682ec8ae8e64\") " pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.304409 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.304497 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knrld\" (UniqueName: \"kubernetes.io/projected/8cff4721-de89-49fa-9f19-682ec8ae8e64-kube-api-access-knrld\") pod \"community-operators-wkk25\" (UID: \"8cff4721-de89-49fa-9f19-682ec8ae8e64\") " pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.304602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cff4721-de89-49fa-9f19-682ec8ae8e64-utilities\") pod \"community-operators-wkk25\" (UID: \"8cff4721-de89-49fa-9f19-682ec8ae8e64\") " pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.304766 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.804757388 +0000 UTC m=+152.908553768 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.374427 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-plzrl"] Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.375315 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.393132 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.396717 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-plzrl"] Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.406184 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.406292 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.906273583 +0000 UTC m=+153.010069953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.406572 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cff4721-de89-49fa-9f19-682ec8ae8e64-catalog-content\") pod \"community-operators-wkk25\" (UID: \"8cff4721-de89-49fa-9f19-682ec8ae8e64\") " pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.406604 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.406629 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knrld\" (UniqueName: \"kubernetes.io/projected/8cff4721-de89-49fa-9f19-682ec8ae8e64-kube-api-access-knrld\") pod \"community-operators-wkk25\" (UID: \"8cff4721-de89-49fa-9f19-682ec8ae8e64\") " pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.407219 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:45.907211659 +0000 UTC m=+153.011008029 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.407337 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cff4721-de89-49fa-9f19-682ec8ae8e64-catalog-content\") pod \"community-operators-wkk25\" (UID: \"8cff4721-de89-49fa-9f19-682ec8ae8e64\") " pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.407562 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cff4721-de89-49fa-9f19-682ec8ae8e64-utilities\") pod \"community-operators-wkk25\" (UID: \"8cff4721-de89-49fa-9f19-682ec8ae8e64\") " pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.407909 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cff4721-de89-49fa-9f19-682ec8ae8e64-utilities\") pod \"community-operators-wkk25\" (UID: \"8cff4721-de89-49fa-9f19-682ec8ae8e64\") " pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.462898 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knrld\" (UniqueName: \"kubernetes.io/projected/8cff4721-de89-49fa-9f19-682ec8ae8e64-kube-api-access-knrld\") pod \"community-operators-wkk25\" (UID: \"8cff4721-de89-49fa-9f19-682ec8ae8e64\") " pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.509874 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.510080 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5267d1d-ec1f-461d-acdc-57303aac7015-catalog-content\") pod \"certified-operators-plzrl\" (UID: \"e5267d1d-ec1f-461d-acdc-57303aac7015\") " pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.510156 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xvk7\" (UniqueName: \"kubernetes.io/projected/e5267d1d-ec1f-461d-acdc-57303aac7015-kube-api-access-2xvk7\") pod \"certified-operators-plzrl\" (UID: \"e5267d1d-ec1f-461d-acdc-57303aac7015\") " pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.510183 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5267d1d-ec1f-461d-acdc-57303aac7015-utilities\") pod \"certified-operators-plzrl\" (UID: 
\"e5267d1d-ec1f-461d-acdc-57303aac7015\") " pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.510313 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:46.010295746 +0000 UTC m=+153.114092126 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.534417 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.612010 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7jmzd"] Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.612698 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xvk7\" (UniqueName: \"kubernetes.io/projected/e5267d1d-ec1f-461d-acdc-57303aac7015-kube-api-access-2xvk7\") pod \"certified-operators-plzrl\" (UID: \"e5267d1d-ec1f-461d-acdc-57303aac7015\") " pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.612745 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5267d1d-ec1f-461d-acdc-57303aac7015-utilities\") pod \"certified-operators-plzrl\" (UID: \"e5267d1d-ec1f-461d-acdc-57303aac7015\") " pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.612776 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.612804 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5267d1d-ec1f-461d-acdc-57303aac7015-catalog-content\") pod \"certified-operators-plzrl\" (UID: \"e5267d1d-ec1f-461d-acdc-57303aac7015\") " pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.612919 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.613244 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5267d1d-ec1f-461d-acdc-57303aac7015-catalog-content\") pod \"certified-operators-plzrl\" (UID: \"e5267d1d-ec1f-461d-acdc-57303aac7015\") " pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.613621 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:46.11360915 +0000 UTC m=+153.217405530 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.613702 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5267d1d-ec1f-461d-acdc-57303aac7015-utilities\") pod \"certified-operators-plzrl\" (UID: \"e5267d1d-ec1f-461d-acdc-57303aac7015\") " pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.644045 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xvk7\" (UniqueName: \"kubernetes.io/projected/e5267d1d-ec1f-461d-acdc-57303aac7015-kube-api-access-2xvk7\") pod \"certified-operators-plzrl\" (UID: \"e5267d1d-ec1f-461d-acdc-57303aac7015\") " pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.647577 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7jmzd"] Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.688663 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.714059 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.714281 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-catalog-content\") pod \"community-operators-7jmzd\" (UID: \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\") " pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.714342 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzhfp\" (UniqueName: \"kubernetes.io/projected/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-kube-api-access-lzhfp\") pod \"community-operators-7jmzd\" (UID: \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\") " pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.714431 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-utilities\") pod \"community-operators-7jmzd\" (UID: \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\") " pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.715741 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:46.215714642 +0000 UTC m=+153.319511022 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.776850 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-77q8g"] Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.777778 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.815035 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77q8g"] Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.815250 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzhfp\" (UniqueName: \"kubernetes.io/projected/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-kube-api-access-lzhfp\") pod \"community-operators-7jmzd\" (UID: \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\") " pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.815437 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.815646 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-utilities\") pod \"community-operators-7jmzd\" (UID: \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\") " pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.815775 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-catalog-content\") pod \"community-operators-7jmzd\" (UID: \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\") " pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.815809 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:46.315788847 +0000 UTC m=+153.419585267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.816322 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-utilities\") pod \"community-operators-7jmzd\" (UID: \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\") " pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.816452 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-catalog-content\") pod \"community-operators-7jmzd\" (UID: \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\") " pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.839188 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzhfp\" (UniqueName: \"kubernetes.io/projected/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-kube-api-access-lzhfp\") pod \"community-operators-7jmzd\" (UID: \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\") " pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.918334 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nf9v5" event={"ID":"db6e2e26-e34f-46e8-a5fe-b25a12930d39","Type":"ContainerStarted","Data":"d157e9589e6fc64aee6762c4ba156401dda350047b522ba27331e44d2dde0625"} Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.918777 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nf9v5" event={"ID":"db6e2e26-e34f-46e8-a5fe-b25a12930d39","Type":"ContainerStarted","Data":"d778d0fac35069ead7687061ad518f468eeedf8b2bc98d01441978ac0f982bd3"} Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.919456 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.919822 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.919989 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:46.419976155 +0000 UTC m=+153.523772535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.920053 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd46l\" (UniqueName: \"kubernetes.io/projected/31eaa34f-a155-495d-a833-a54ba9546a1a-kube-api-access-zd46l\") pod \"certified-operators-77q8g\" (UID: \"31eaa34f-a155-495d-a833-a54ba9546a1a\") " pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.920089 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.920114 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31eaa34f-a155-495d-a833-a54ba9546a1a-catalog-content\") pod \"certified-operators-77q8g\" (UID: \"31eaa34f-a155-495d-a833-a54ba9546a1a\") " pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.920159 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31eaa34f-a155-495d-a833-a54ba9546a1a-utilities\") pod \"certified-operators-77q8g\" (UID: \"31eaa34f-a155-495d-a833-a54ba9546a1a\") " pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:57:45 crc kubenswrapper[4778]: E1205 15:57:45.920434 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:46.420427867 +0000 UTC m=+153.524224247 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.929687 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.937053 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" event={"ID":"01fba0d7-2372-4976-8ccc-4a4b15ff2fb5","Type":"ContainerStarted","Data":"be79fd9bc8bfa2bf506b04251af4585fed97aaf68445f306626242111c4eba95"} Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.952683 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nf9v5" podStartSLOduration=8.952665478 podStartE2EDuration="8.952665478s" podCreationTimestamp="2025-12-05 15:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:45.950944052 +0000 UTC m=+153.054740432" watchObservedRunningTime="2025-12-05 15:57:45.952665478 +0000 UTC m=+153.056461858" Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.962856 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" event={"ID":"05baa2fe-0ea7-41b3-9e70-04412e0e5658","Type":"ContainerStarted","Data":"a029b764f46d5c2cf6eaa52b4e9c32999f5ae745a442644bab4c5185f25d5ac0"} Dec 05 15:57:45 crc kubenswrapper[4778]: I1205 15:57:45.975328 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" podStartSLOduration=132.975308927 podStartE2EDuration="2m12.975308927s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:45.972101249 +0000 UTC m=+153.075897629" watchObservedRunningTime="2025-12-05 15:57:45.975308927 +0000 UTC m=+153.079105307" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.010234 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" event={"ID":"cfbfb374-e786-4e49-8c80-54ec12c7abcc","Type":"ContainerStarted","Data":"691718629a46b71be1f44b8419e0515c2244285902f996b411a5a39abe99d190"} Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.010287 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" event={"ID":"cfbfb374-e786-4e49-8c80-54ec12c7abcc","Type":"ContainerStarted","Data":"e873c673685945b998b154bc237b2b962b3fa511977647a77f07f1cb969c0f64"} Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.013996 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" event={"ID":"9093abb1-e6d3-48d8-972a-88c8f8ec9fe2","Type":"ContainerStarted","Data":"a476d5afe3bbb242aebd7b73f13634098c2ef4e6bf0318df37074a12319863c9"} Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.019746 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t72kw" event={"ID":"e9ff9de8-25f7-408a-834b-ec4edf3c98e5","Type":"ContainerStarted","Data":"d439052547986cdb5be8ac0d0b6a82f4dfe0833a5c3a9e549563bdcba910efaf"} Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.031667 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.032530 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd46l\" (UniqueName: \"kubernetes.io/projected/31eaa34f-a155-495d-a833-a54ba9546a1a-kube-api-access-zd46l\") pod \"certified-operators-77q8g\" (UID: \"31eaa34f-a155-495d-a833-a54ba9546a1a\") " pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.032811 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31eaa34f-a155-495d-a833-a54ba9546a1a-catalog-content\") pod \"certified-operators-77q8g\" (UID: \"31eaa34f-a155-495d-a833-a54ba9546a1a\") " pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.033029 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31eaa34f-a155-495d-a833-a54ba9546a1a-utilities\") pod \"certified-operators-77q8g\" (UID: \"31eaa34f-a155-495d-a833-a54ba9546a1a\") " pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.033519 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rbbgx" event={"ID":"cca81879-8d18-4463-8db2-b17f8c24874c","Type":"ContainerStarted","Data":"d9b939846debb680f8b88d1becaedcee32c50f0239f3a4bc27227f1f41b3a0ba"} Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.033920 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31eaa34f-a155-495d-a833-a54ba9546a1a-utilities\") pod \"certified-operators-77q8g\" (UID: \"31eaa34f-a155-495d-a833-a54ba9546a1a\") " pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.036129 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nfqps" podStartSLOduration=134.036110709 podStartE2EDuration="2m14.036110709s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:46.035809771 +0000 UTC m=+153.139606151" watchObservedRunningTime="2025-12-05 15:57:46.036110709 +0000 UTC m=+153.139907089" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.036818 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31eaa34f-a155-495d-a833-a54ba9546a1a-catalog-content\") pod \"certified-operators-77q8g\" (UID: \"31eaa34f-a155-495d-a833-a54ba9546a1a\") " pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:57:46 crc kubenswrapper[4778]: E1205 15:57:46.038058 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:46.538028012 +0000 UTC m=+153.641824392 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.039003 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bctw8" event={"ID":"4fdf5c65-8d99-48be-9033-c8df3a8afdea","Type":"ContainerStarted","Data":"4f3ba15f904f71612ee8f1dff46cf0d5142a13352e5d98b28d3970ded46fd380"} Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.055174 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" event={"ID":"3d21106f-1d62-4e2f-98ac-5411f66d8352","Type":"ContainerStarted","Data":"03643514342c4676f35897e84fe1d76b0be2da2ba8bb4d5c7e468ebe4facb0fb"} Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.066544 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:46 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:46 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:46 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.066606 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.068099 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd46l\" (UniqueName: \"kubernetes.io/projected/31eaa34f-a155-495d-a833-a54ba9546a1a-kube-api-access-zd46l\") pod \"certified-operators-77q8g\" (UID: \"31eaa34f-a155-495d-a833-a54ba9546a1a\") " pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.083346 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" event={"ID":"a0b2c74a-c566-4614-aa26-65f67ae6fc94","Type":"ContainerStarted","Data":"3b9993bf28bfc8dcbf8c842220782ca97afb6c74e9454e40d39249cc1f1fed5c"} Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.095025 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" podStartSLOduration=134.095010419 podStartE2EDuration="2m14.095010419s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:46.076847733 +0000 UTC m=+153.180644123" watchObservedRunningTime="2025-12-05 15:57:46.095010419 +0000 UTC m=+153.198806799" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.096347 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.114741 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ndj72" event={"ID":"e1235317-93be-4e63-b980-259191d32b82","Type":"ContainerStarted","Data":"4880466c6e9fcb8f4b6b3dac2b692b00fd138762e765e9b9b011d9391bd78b0e"} Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.137705 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w929v" podStartSLOduration=133.137685126 podStartE2EDuration="2m13.137685126s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:46.137439549 +0000 UTC m=+153.241235919" watchObservedRunningTime="2025-12-05 15:57:46.137685126 +0000 UTC m=+153.241481506" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.138343 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:46 crc kubenswrapper[4778]: E1205 15:57:46.139999 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:46.639985589 +0000 UTC m=+153.743781959 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.145210 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-nnbzm" podStartSLOduration=133.145198751 podStartE2EDuration="2m13.145198751s" podCreationTimestamp="2025-12-05 15:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:46.095975576 +0000 UTC m=+153.199771956" watchObservedRunningTime="2025-12-05 15:57:46.145198751 +0000 UTC m=+153.248995131" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.160741 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mw82h" event={"ID":"11591472-458c-43dc-b51b-2b15987291a0","Type":"ContainerStarted","Data":"e1240ac4e2529007f92b21663050ab27b6a2e8f4a880d6498ced3064484eaff9"} Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.180786 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" event={"ID":"26dc74da-e40d-4974-be9b-4f25f1eb66e7","Type":"ContainerStarted","Data":"b100a63c37fdc99d6c4dac65fffa07e54409fce8a03288df3478a6097e8f112b"} Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.213132 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" event={"ID":"b8d6b1ed-75bd-4c5a-ab2d-294b7359301d","Type":"ContainerStarted","Data":"b1b73e7d9f0bbbb2b0a152034ecdde22a80d969c3929c23019cc7b5174f8ad35"} Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.223847 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xvnf" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.236251 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-plzrl"] Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.239797 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:46 crc kubenswrapper[4778]: E1205 15:57:46.240528 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:46.740514957 +0000 UTC m=+153.844311337 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:46 crc kubenswrapper[4778]: W1205 15:57:46.247544 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5267d1d_ec1f_461d_acdc_57303aac7015.slice/crio-4a158d8a12d5ca86d604a7718eb447da323a4fd559b55c8120284263891525ce WatchSource:0}: Error finding container 4a158d8a12d5ca86d604a7718eb447da323a4fd559b55c8120284263891525ce: Status 404 returned error can't find the container with id 4a158d8a12d5ca86d604a7718eb447da323a4fd559b55c8120284263891525ce Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.269663 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wkk25"] Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.341329 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:46 crc kubenswrapper[4778]: E1205 15:57:46.343308 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:46.843297157 +0000 UTC m=+153.947093537 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.368300 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7jmzd"] Dec 05 15:57:46 crc kubenswrapper[4778]: W1205 15:57:46.398143 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod840cfa6b_b6e9_4eb0_a1ca_05d34003f9fa.slice/crio-5c1755539a4ebf1abef2b256aa9c17a45ace27074ec2b15ba1afb202c10429d2 WatchSource:0}: Error finding container 5c1755539a4ebf1abef2b256aa9c17a45ace27074ec2b15ba1afb202c10429d2: Status 404 returned error can't find the container with id 5c1755539a4ebf1abef2b256aa9c17a45ace27074ec2b15ba1afb202c10429d2 Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.442725 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:46 crc kubenswrapper[4778]: E1205 15:57:46.443096 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:46.943076504 +0000 UTC m=+154.046872884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.466869 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77q8g"] Dec 05 15:57:46 crc kubenswrapper[4778]: W1205 15:57:46.530100 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31eaa34f_a155_495d_a833_a54ba9546a1a.slice/crio-460b27a196277ee289c272fca8c37b495e919e6db4ef5f31d1e0b8794c75d44f WatchSource:0}: Error finding container 460b27a196277ee289c272fca8c37b495e919e6db4ef5f31d1e0b8794c75d44f: Status 404 returned error can't find the container with id 460b27a196277ee289c272fca8c37b495e919e6db4ef5f31d1e0b8794c75d44f Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.543943 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:46 crc kubenswrapper[4778]: E1205 15:57:46.544242 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.044230699 +0000 UTC m=+154.148027079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.645023 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:46 crc kubenswrapper[4778]: E1205 15:57:46.645384 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.145335083 +0000 UTC m=+154.249131463 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.645572 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:46 crc kubenswrapper[4778]: E1205 15:57:46.645878 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.145866257 +0000 UTC m=+154.249662637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.664008 4778 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.685802 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zhb7w" Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.746956 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:46 crc kubenswrapper[4778]: E1205 15:57:46.747246 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.247231668 +0000 UTC m=+154.351028048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.848157 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:46 crc kubenswrapper[4778]: E1205 15:57:46.848536 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.348519017 +0000 UTC m=+154.452315397 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.949192 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:46 crc kubenswrapper[4778]: E1205 15:57:46.949431 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.449405394 +0000 UTC m=+154.553201774 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:46 crc kubenswrapper[4778]: I1205 15:57:46.949539 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:46 crc kubenswrapper[4778]: E1205 15:57:46.949848 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.449836926 +0000 UTC m=+154.553633306 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.050610 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:47 crc kubenswrapper[4778]: E1205 15:57:47.050924 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.550896438 +0000 UTC m=+154.654692828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.051056 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:47 crc kubenswrapper[4778]: E1205 15:57:47.051484 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.551467045 +0000 UTC m=+154.655263445 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.053297 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:47 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:47 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:47 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.053346 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.152108 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:47 crc kubenswrapper[4778]: E1205 15:57:47.152496 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.652458635 +0000 UTC m=+154.756255055 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.163685 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gwxnr"] Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.165481 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.167103 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.174815 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwxnr"] Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.219758 4778 generic.go:334] "Generic (PLEG): container finished" podID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" containerID="bb145dd12b0ccfece3846efa5388f9eac39608fdeb5ffc38eace30e4ddd73ed9" exitCode=0 Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.220000 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jmzd" event={"ID":"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa","Type":"ContainerDied","Data":"bb145dd12b0ccfece3846efa5388f9eac39608fdeb5ffc38eace30e4ddd73ed9"} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.220052 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jmzd" event={"ID":"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa","Type":"ContainerStarted","Data":"5c1755539a4ebf1abef2b256aa9c17a45ace27074ec2b15ba1afb202c10429d2"} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.221264 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.222126 4778 generic.go:334] "Generic (PLEG): container finished" podID="31eaa34f-a155-495d-a833-a54ba9546a1a" containerID="e599e057b68a85b36e8fea7866e3ee7a41e9cb941e746329fc97003e261beaae" exitCode=0 Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.222228 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77q8g" event={"ID":"31eaa34f-a155-495d-a833-a54ba9546a1a","Type":"ContainerDied","Data":"e599e057b68a85b36e8fea7866e3ee7a41e9cb941e746329fc97003e261beaae"} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.222265 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77q8g" event={"ID":"31eaa34f-a155-495d-a833-a54ba9546a1a","Type":"ContainerStarted","Data":"460b27a196277ee289c272fca8c37b495e919e6db4ef5f31d1e0b8794c75d44f"} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.224099 4778 generic.go:334] "Generic (PLEG): container finished" podID="e5267d1d-ec1f-461d-acdc-57303aac7015" containerID="6147c8dea248235058419b3d71d2b4870d90790749a9eca528192dad8b32c7e2" exitCode=0 Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.224165 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-plzrl" event={"ID":"e5267d1d-ec1f-461d-acdc-57303aac7015","Type":"ContainerDied","Data":"6147c8dea248235058419b3d71d2b4870d90790749a9eca528192dad8b32c7e2"} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.224201 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plzrl" event={"ID":"e5267d1d-ec1f-461d-acdc-57303aac7015","Type":"ContainerStarted","Data":"4a158d8a12d5ca86d604a7718eb447da323a4fd559b55c8120284263891525ce"} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.226032 4778 generic.go:334] "Generic (PLEG): container finished" podID="00bbab66-fa02-4505-8afb-d9d9c1370d95" containerID="5ed5f18a843b48f50604e98b811e271036ea4e3626b486832ce9e95c37f718de" exitCode=0 Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.226089 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" event={"ID":"00bbab66-fa02-4505-8afb-d9d9c1370d95","Type":"ContainerDied","Data":"5ed5f18a843b48f50604e98b811e271036ea4e3626b486832ce9e95c37f718de"} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.227131 4778 generic.go:334] "Generic (PLEG): container finished" podID="8cff4721-de89-49fa-9f19-682ec8ae8e64" containerID="8c01060dabd996d45f7ca5581b1dc952b9bfb3fecd2065f0017193b2247f3afa" exitCode=0 Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.227168 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkk25" event={"ID":"8cff4721-de89-49fa-9f19-682ec8ae8e64","Type":"ContainerDied","Data":"8c01060dabd996d45f7ca5581b1dc952b9bfb3fecd2065f0017193b2247f3afa"} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.227184 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkk25" event={"ID":"8cff4721-de89-49fa-9f19-682ec8ae8e64","Type":"ContainerStarted","Data":"6d0911295a9a1b3eafc0d08c9a9954d0fa4d3f24bcf28af09a1b4ea434a4ec29"} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.234416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bctw8" event={"ID":"4fdf5c65-8d99-48be-9033-c8df3a8afdea","Type":"ContainerStarted","Data":"524336bd5cbd5c2cf43a74b6ca29864cb016972647992a62b55d0dac69440a12"} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.234459 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bctw8" event={"ID":"4fdf5c65-8d99-48be-9033-c8df3a8afdea","Type":"ContainerStarted","Data":"3f5be0853bac75d58a9c32c195157cece04b3b3122c8c76fbab99783fb469a16"} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.234471 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bctw8" event={"ID":"4fdf5c65-8d99-48be-9033-c8df3a8afdea","Type":"ContainerStarted","Data":"84be18bda1b03a36d4cdac8677df9caa7045b3ea0436aaa84847a4fe9b290cf4"} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.244831 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5bl2q" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.254689 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: 
\"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:47 crc kubenswrapper[4778]: E1205 15:57:47.254953 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.754942006 +0000 UTC m=+154.858738376 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.308571 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bctw8" podStartSLOduration=10.308556051 podStartE2EDuration="10.308556051s" podCreationTimestamp="2025-12-05 15:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:47.304396428 +0000 UTC m=+154.408192808" watchObservedRunningTime="2025-12-05 15:57:47.308556051 +0000 UTC m=+154.412352431" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.356208 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.356442 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c6979b-453d-49a4-889e-e46eff9af778-utilities\") pod \"redhat-marketplace-gwxnr\" (UID: \"70c6979b-453d-49a4-889e-e46eff9af778\") " pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:57:47 crc kubenswrapper[4778]: E1205 15:57:47.357231 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.857216062 +0000 UTC m=+154.961012442 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.359081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwm9t\" (UniqueName: \"kubernetes.io/projected/70c6979b-453d-49a4-889e-e46eff9af778-kube-api-access-qwm9t\") pod \"redhat-marketplace-gwxnr\" (UID: \"70c6979b-453d-49a4-889e-e46eff9af778\") " pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.359305 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.359463 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c6979b-453d-49a4-889e-e46eff9af778-catalog-content\") pod \"redhat-marketplace-gwxnr\" (UID: \"70c6979b-453d-49a4-889e-e46eff9af778\") " pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:57:47 crc kubenswrapper[4778]: E1205 15:57:47.362210 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 15:57:47.862190808 +0000 UTC m=+154.965987278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m6452" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.444731 4778 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-05T15:57:46.664032823Z","Handler":null,"Name":""} Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.449842 4778 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.450051 4778 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.460677 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.461173 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwm9t\" (UniqueName: \"kubernetes.io/projected/70c6979b-453d-49a4-889e-e46eff9af778-kube-api-access-qwm9t\") pod \"redhat-marketplace-gwxnr\" (UID: \"70c6979b-453d-49a4-889e-e46eff9af778\") " pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.461356 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c6979b-453d-49a4-889e-e46eff9af778-catalog-content\") pod \"redhat-marketplace-gwxnr\" (UID: \"70c6979b-453d-49a4-889e-e46eff9af778\") " pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.461564 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c6979b-453d-49a4-889e-e46eff9af778-utilities\") pod \"redhat-marketplace-gwxnr\" (UID: \"70c6979b-453d-49a4-889e-e46eff9af778\") " pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.461887 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c6979b-453d-49a4-889e-e46eff9af778-catalog-content\") pod \"redhat-marketplace-gwxnr\" (UID: \"70c6979b-453d-49a4-889e-e46eff9af778\") " pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.462131 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c6979b-453d-49a4-889e-e46eff9af778-utilities\") pod \"redhat-marketplace-gwxnr\" (UID: \"70c6979b-453d-49a4-889e-e46eff9af778\") " 
pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.475317 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.482874 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwm9t\" (UniqueName: \"kubernetes.io/projected/70c6979b-453d-49a4-889e-e46eff9af778-kube-api-access-qwm9t\") pod \"redhat-marketplace-gwxnr\" (UID: \"70c6979b-453d-49a4-889e-e46eff9af778\") " pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.491181 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.563455 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.575214 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fj8nh"] Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.576482 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.591040 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj8nh"] Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.596393 4778 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.596440 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.628053 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m6452\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.687682 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwxnr"] Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.689089 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:47 crc kubenswrapper[4778]: W1205 15:57:47.694659 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70c6979b_453d_49a4_889e_e46eff9af778.slice/crio-f6cd4ce76475c37356ae9abd3e7ce63b2fb7a966480f805a2e7e92083fc16c7c WatchSource:0}: Error finding container f6cd4ce76475c37356ae9abd3e7ce63b2fb7a966480f805a2e7e92083fc16c7c: Status 404 returned error can't find the container with id f6cd4ce76475c37356ae9abd3e7ce63b2fb7a966480f805a2e7e92083fc16c7c Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.765943 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg6q8\" (UniqueName: \"kubernetes.io/projected/4047fee6-0560-4d41-8212-aa284021dff0-kube-api-access-wg6q8\") pod \"redhat-marketplace-fj8nh\" (UID: \"4047fee6-0560-4d41-8212-aa284021dff0\") " pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.766006 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4047fee6-0560-4d41-8212-aa284021dff0-catalog-content\") pod \"redhat-marketplace-fj8nh\" (UID: \"4047fee6-0560-4d41-8212-aa284021dff0\") " pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.766086 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4047fee6-0560-4d41-8212-aa284021dff0-utilities\") pod \"redhat-marketplace-fj8nh\" (UID: \"4047fee6-0560-4d41-8212-aa284021dff0\") " pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.866882 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg6q8\" (UniqueName: \"kubernetes.io/projected/4047fee6-0560-4d41-8212-aa284021dff0-kube-api-access-wg6q8\") pod \"redhat-marketplace-fj8nh\" (UID: 
\"4047fee6-0560-4d41-8212-aa284021dff0\") " pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.867198 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4047fee6-0560-4d41-8212-aa284021dff0-catalog-content\") pod \"redhat-marketplace-fj8nh\" (UID: \"4047fee6-0560-4d41-8212-aa284021dff0\") " pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.867236 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4047fee6-0560-4d41-8212-aa284021dff0-utilities\") pod \"redhat-marketplace-fj8nh\" (UID: \"4047fee6-0560-4d41-8212-aa284021dff0\") " pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.867754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4047fee6-0560-4d41-8212-aa284021dff0-utilities\") pod \"redhat-marketplace-fj8nh\" (UID: \"4047fee6-0560-4d41-8212-aa284021dff0\") " pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.867785 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4047fee6-0560-4d41-8212-aa284021dff0-catalog-content\") pod \"redhat-marketplace-fj8nh\" (UID: \"4047fee6-0560-4d41-8212-aa284021dff0\") " pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.891770 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg6q8\" (UniqueName: \"kubernetes.io/projected/4047fee6-0560-4d41-8212-aa284021dff0-kube-api-access-wg6q8\") pod \"redhat-marketplace-fj8nh\" (UID: \"4047fee6-0560-4d41-8212-aa284021dff0\") " pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.907055 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m6452"] Dec 05 15:57:47 crc kubenswrapper[4778]: I1205 15:57:47.931034 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.050077 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:48 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:48 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:48 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.050140 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.218410 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj8nh"] Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.246254 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj8nh" event={"ID":"4047fee6-0560-4d41-8212-aa284021dff0","Type":"ContainerStarted","Data":"3855611cbe647602b6a6c843489ce64819971bae0b2bff928275406ae66dde26"} Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.249206 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" event={"ID":"7d0c5c29-5367-41e6-be46-e23a9ac5e281","Type":"ContainerStarted","Data":"01ce86f1a69afcbf4f879cf61543dd7e1ed6b1664ee2e3a6a6462e786b47d2fd"} Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.249230 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" event={"ID":"7d0c5c29-5367-41e6-be46-e23a9ac5e281","Type":"ContainerStarted","Data":"0dc874c0973c58e30d428e8515dbcc5cd901ec22c61825fd87fda006317a1557"} Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.249698 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.253318 4778 generic.go:334] "Generic (PLEG): container finished" podID="70c6979b-453d-49a4-889e-e46eff9af778" containerID="005c15e3d7a26871af424c95767ca68ff3cfeab4c2b4971d856fbb5dcfa4796b" exitCode=0 Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.253817 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwxnr" event={"ID":"70c6979b-453d-49a4-889e-e46eff9af778","Type":"ContainerDied","Data":"005c15e3d7a26871af424c95767ca68ff3cfeab4c2b4971d856fbb5dcfa4796b"} Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.253835 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwxnr" event={"ID":"70c6979b-453d-49a4-889e-e46eff9af778","Type":"ContainerStarted","Data":"f6cd4ce76475c37356ae9abd3e7ce63b2fb7a966480f805a2e7e92083fc16c7c"} Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.320044 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" podStartSLOduration=136.316534144 podStartE2EDuration="2m16.316534144s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:48.273226311 +0000 UTC m=+155.377022711" watchObservedRunningTime="2025-12-05 15:57:48.316534144 +0000 UTC m=+155.420330524" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.370212 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tlnh4"] Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.371448 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.379573 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.392787 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tlnh4"] Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.483595 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2fp9\" (UniqueName: \"kubernetes.io/projected/eb1076bb-639d-42e5-ab8c-d13eb121cc95-kube-api-access-f2fp9\") pod \"redhat-operators-tlnh4\" (UID: \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\") " pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.483652 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb1076bb-639d-42e5-ab8c-d13eb121cc95-utilities\") pod \"redhat-operators-tlnh4\" (UID: \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\") " pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.483679 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb1076bb-639d-42e5-ab8c-d13eb121cc95-catalog-content\") pod \"redhat-operators-tlnh4\" (UID: \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\") " pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.583590 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.584796 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2fp9\" (UniqueName: \"kubernetes.io/projected/eb1076bb-639d-42e5-ab8c-d13eb121cc95-kube-api-access-f2fp9\") pod \"redhat-operators-tlnh4\" (UID: \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\") " pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.584882 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb1076bb-639d-42e5-ab8c-d13eb121cc95-utilities\") pod \"redhat-operators-tlnh4\" (UID: \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\") " pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.584935 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb1076bb-639d-42e5-ab8c-d13eb121cc95-catalog-content\") pod \"redhat-operators-tlnh4\" (UID: \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\") " pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.585537 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb1076bb-639d-42e5-ab8c-d13eb121cc95-catalog-content\") pod \"redhat-operators-tlnh4\" (UID: \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\") " pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.585560 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb1076bb-639d-42e5-ab8c-d13eb121cc95-utilities\") pod \"redhat-operators-tlnh4\" (UID: \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\") " pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.611210 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2fp9\" (UniqueName: \"kubernetes.io/projected/eb1076bb-639d-42e5-ab8c-d13eb121cc95-kube-api-access-f2fp9\") pod \"redhat-operators-tlnh4\" (UID: \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\") " pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.686203 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00bbab66-fa02-4505-8afb-d9d9c1370d95-secret-volume\") pod \"00bbab66-fa02-4505-8afb-d9d9c1370d95\" (UID: \"00bbab66-fa02-4505-8afb-d9d9c1370d95\") " Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.686342 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgq2b\" (UniqueName: \"kubernetes.io/projected/00bbab66-fa02-4505-8afb-d9d9c1370d95-kube-api-access-qgq2b\") pod \"00bbab66-fa02-4505-8afb-d9d9c1370d95\" (UID: \"00bbab66-fa02-4505-8afb-d9d9c1370d95\") " Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.686476 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00bbab66-fa02-4505-8afb-d9d9c1370d95-config-volume\") pod \"00bbab66-fa02-4505-8afb-d9d9c1370d95\" (UID: \"00bbab66-fa02-4505-8afb-d9d9c1370d95\") " Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.686989 
4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00bbab66-fa02-4505-8afb-d9d9c1370d95-config-volume" (OuterVolumeSpecName: "config-volume") pod "00bbab66-fa02-4505-8afb-d9d9c1370d95" (UID: "00bbab66-fa02-4505-8afb-d9d9c1370d95"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.690202 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bbab66-fa02-4505-8afb-d9d9c1370d95-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "00bbab66-fa02-4505-8afb-d9d9c1370d95" (UID: "00bbab66-fa02-4505-8afb-d9d9c1370d95"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.691718 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00bbab66-fa02-4505-8afb-d9d9c1370d95-kube-api-access-qgq2b" (OuterVolumeSpecName: "kube-api-access-qgq2b") pod "00bbab66-fa02-4505-8afb-d9d9c1370d95" (UID: "00bbab66-fa02-4505-8afb-d9d9c1370d95"). InnerVolumeSpecName "kube-api-access-qgq2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.710970 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.764039 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xr8gn"] Dec 05 15:57:48 crc kubenswrapper[4778]: E1205 15:57:48.764297 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bbab66-fa02-4505-8afb-d9d9c1370d95" containerName="collect-profiles" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.764312 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bbab66-fa02-4505-8afb-d9d9c1370d95" containerName="collect-profiles" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.764459 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="00bbab66-fa02-4505-8afb-d9d9c1370d95" containerName="collect-profiles" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.765317 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.773601 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xr8gn"] Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.791790 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/866ee44d-6331-47f7-8452-be6f25815a1e-utilities\") pod \"redhat-operators-xr8gn\" (UID: \"866ee44d-6331-47f7-8452-be6f25815a1e\") " pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.791826 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6xht\" (UniqueName: \"kubernetes.io/projected/866ee44d-6331-47f7-8452-be6f25815a1e-kube-api-access-x6xht\") pod \"redhat-operators-xr8gn\" (UID: \"866ee44d-6331-47f7-8452-be6f25815a1e\") " pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.791894 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/866ee44d-6331-47f7-8452-be6f25815a1e-catalog-content\") pod \"redhat-operators-xr8gn\" (UID: \"866ee44d-6331-47f7-8452-be6f25815a1e\") " pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.791965 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00bbab66-fa02-4505-8afb-d9d9c1370d95-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.791977 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgq2b\" (UniqueName: \"kubernetes.io/projected/00bbab66-fa02-4505-8afb-d9d9c1370d95-kube-api-access-qgq2b\") on node \"crc\" DevicePath \"\"" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.791986 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00bbab66-fa02-4505-8afb-d9d9c1370d95-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.893541 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/866ee44d-6331-47f7-8452-be6f25815a1e-catalog-content\") pod \"redhat-operators-xr8gn\" (UID: \"866ee44d-6331-47f7-8452-be6f25815a1e\") " pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.893611 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/866ee44d-6331-47f7-8452-be6f25815a1e-utilities\") pod \"redhat-operators-xr8gn\" (UID: \"866ee44d-6331-47f7-8452-be6f25815a1e\") " pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.893635 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6xht\" (UniqueName: \"kubernetes.io/projected/866ee44d-6331-47f7-8452-be6f25815a1e-kube-api-access-x6xht\") pod \"redhat-operators-xr8gn\" (UID: \"866ee44d-6331-47f7-8452-be6f25815a1e\") " pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.894061 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/866ee44d-6331-47f7-8452-be6f25815a1e-utilities\") pod \"redhat-operators-xr8gn\" (UID: \"866ee44d-6331-47f7-8452-be6f25815a1e\") " pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.901963 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/866ee44d-6331-47f7-8452-be6f25815a1e-catalog-content\") pod \"redhat-operators-xr8gn\" (UID: \"866ee44d-6331-47f7-8452-be6f25815a1e\") " pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:57:48 crc kubenswrapper[4778]: I1205 15:57:48.923149 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6xht\" (UniqueName: \"kubernetes.io/projected/866ee44d-6331-47f7-8452-be6f25815a1e-kube-api-access-x6xht\") pod \"redhat-operators-xr8gn\" (UID: \"866ee44d-6331-47f7-8452-be6f25815a1e\") " pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.051321 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:49 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:49 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:49 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.051896 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.057069 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tlnh4"] Dec 05 15:57:49 crc kubenswrapper[4778]: W1205 15:57:49.067114 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb1076bb_639d_42e5_ab8c_d13eb121cc95.slice/crio-31527b7d0afc73b65dd40049eee8fecd8f623a096d1bc3e1cad6cf72585dcf50 WatchSource:0}: Error finding container 31527b7d0afc73b65dd40049eee8fecd8f623a096d1bc3e1cad6cf72585dcf50: Status 404 returned error can't find the container with id 31527b7d0afc73b65dd40049eee8fecd8f623a096d1bc3e1cad6cf72585dcf50 Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.097766 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.269566 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.286198 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.287352 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd" event={"ID":"00bbab66-fa02-4505-8afb-d9d9c1370d95","Type":"ContainerDied","Data":"b78999547d192ae133f8e845ec7f6ecba7e387b7a7967d268451d46446fc683d"} Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.287889 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b78999547d192ae133f8e845ec7f6ecba7e387b7a7967d268451d46446fc683d" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.288003 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlnh4" event={"ID":"eb1076bb-639d-42e5-ab8c-d13eb121cc95","Type":"ContainerStarted","Data":"31527b7d0afc73b65dd40049eee8fecd8f623a096d1bc3e1cad6cf72585dcf50"} Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.291099 4778 generic.go:334] "Generic (PLEG): container finished" podID="4047fee6-0560-4d41-8212-aa284021dff0" containerID="3f317cb916b5a76c33c3c3f7528610da54934e45626536ec1a4c1232511fef21" exitCode=0 Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.291201 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj8nh" event={"ID":"4047fee6-0560-4d41-8212-aa284021dff0","Type":"ContainerDied","Data":"3f317cb916b5a76c33c3c3f7528610da54934e45626536ec1a4c1232511fef21"} Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.484978 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xr8gn"] Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.774793 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-smxt5" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.920256 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.920598 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.920609 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.921898 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.929407 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.942488 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.945482 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 
15:57:49.946761 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.963533 4778 patch_prober.go:28] interesting pod/console-f9d7485db-gnnls container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Dec 05 15:57:49 crc kubenswrapper[4778]: I1205 15:57:49.963591 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-gnnls" podUID="0c134aff-5bc5-4901-8746-5f79fb395b01" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.046446 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.049665 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:50 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:50 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:50 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.049719 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.119047 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.119739 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.121727 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.121986 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.127434 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.215323 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2bf1e687-285f-496a-b75d-7f6cae5088f2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2bf1e687-285f-496a-b75d-7f6cae5088f2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.215520 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2bf1e687-285f-496a-b75d-7f6cae5088f2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2bf1e687-285f-496a-b75d-7f6cae5088f2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.300507 4778 generic.go:334] "Generic (PLEG): container finished" podID="866ee44d-6331-47f7-8452-be6f25815a1e" containerID="a9fa942803b6fb651824147607800240a3ed86de42fb24b84e3c2d159293a8c5" exitCode=0 Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.300608 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr8gn" event={"ID":"866ee44d-6331-47f7-8452-be6f25815a1e","Type":"ContainerDied","Data":"a9fa942803b6fb651824147607800240a3ed86de42fb24b84e3c2d159293a8c5"} Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.300660 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr8gn" event={"ID":"866ee44d-6331-47f7-8452-be6f25815a1e","Type":"ContainerStarted","Data":"0edf935d4e0b5ae7ecfb618d388746b5db541cfde2678792396bc4f885ad184f"} Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.302233 4778 generic.go:334] "Generic (PLEG): container finished" podID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" containerID="dda4b4412b2b2f999775b70a53aeb23388087148d3c0047f1f6d8c61a6aa3b79" exitCode=0 Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.302422 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlnh4" event={"ID":"eb1076bb-639d-42e5-ab8c-d13eb121cc95","Type":"ContainerDied","Data":"dda4b4412b2b2f999775b70a53aeb23388087148d3c0047f1f6d8c61a6aa3b79"} Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.308491 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cp4xg" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.309009 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ffqnf" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.320226 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/2bf1e687-285f-496a-b75d-7f6cae5088f2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2bf1e687-285f-496a-b75d-7f6cae5088f2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.320402 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2bf1e687-285f-496a-b75d-7f6cae5088f2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2bf1e687-285f-496a-b75d-7f6cae5088f2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.320563 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2bf1e687-285f-496a-b75d-7f6cae5088f2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2bf1e687-285f-496a-b75d-7f6cae5088f2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.322078 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.327808 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.329941 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.337295 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.338611 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.435925 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2bf1e687-285f-496a-b75d-7f6cae5088f2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2bf1e687-285f-496a-b75d-7f6cae5088f2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.494617 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.529127 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc98d520-7ab5-4623-91d6-066f4390100f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cc98d520-7ab5-4623-91d6-066f4390100f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.529207 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc98d520-7ab5-4623-91d6-066f4390100f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cc98d520-7ab5-4623-91d6-066f4390100f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.630379 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc98d520-7ab5-4623-91d6-066f4390100f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cc98d520-7ab5-4623-91d6-066f4390100f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.630677 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc98d520-7ab5-4623-91d6-066f4390100f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cc98d520-7ab5-4623-91d6-066f4390100f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.630755 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc98d520-7ab5-4623-91d6-066f4390100f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cc98d520-7ab5-4623-91d6-066f4390100f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.648963 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc98d520-7ab5-4623-91d6-066f4390100f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cc98d520-7ab5-4623-91d6-066f4390100f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.735064 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 15:57:50 crc kubenswrapper[4778]: I1205 15:57:50.852246 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 15:57:50 crc kubenswrapper[4778]: W1205 15:57:50.879537 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2bf1e687_285f_496a_b75d_7f6cae5088f2.slice/crio-eefd46356de5c082a4217f5adaed6abeb235bab79720c2473b88ccf0990550cc WatchSource:0}: Error finding container eefd46356de5c082a4217f5adaed6abeb235bab79720c2473b88ccf0990550cc: Status 404 returned error can't find the container with id eefd46356de5c082a4217f5adaed6abeb235bab79720c2473b88ccf0990550cc Dec 05 15:57:51 crc kubenswrapper[4778]: I1205 15:57:51.026956 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 15:57:51 crc kubenswrapper[4778]: I1205 15:57:51.068283 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:51 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:51 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:51 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:51 crc kubenswrapper[4778]: I1205 15:57:51.068356 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:51 crc kubenswrapper[4778]: W1205 15:57:51.078761 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcc98d520_7ab5_4623_91d6_066f4390100f.slice/crio-5a9f9c5640728b7829b64ad34a8cd93373f89e89a11d867dc5ff40d8ea100690 WatchSource:0}: Error finding container 5a9f9c5640728b7829b64ad34a8cd93373f89e89a11d867dc5ff40d8ea100690: Status 404 returned error can't find the container with id 5a9f9c5640728b7829b64ad34a8cd93373f89e89a11d867dc5ff40d8ea100690 Dec 05 15:57:51 crc kubenswrapper[4778]: I1205 15:57:51.384071 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2bf1e687-285f-496a-b75d-7f6cae5088f2","Type":"ContainerStarted","Data":"eefd46356de5c082a4217f5adaed6abeb235bab79720c2473b88ccf0990550cc"} Dec 05 15:57:51 crc kubenswrapper[4778]: I1205 15:57:51.391443 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc98d520-7ab5-4623-91d6-066f4390100f","Type":"ContainerStarted","Data":"5a9f9c5640728b7829b64ad34a8cd93373f89e89a11d867dc5ff40d8ea100690"} Dec 05 15:57:52 crc kubenswrapper[4778]: I1205 15:57:52.049461 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:52 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:52 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:52 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:52 crc kubenswrapper[4778]: I1205 15:57:52.049550 4778 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:53 crc kubenswrapper[4778]: I1205 15:57:53.049584 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:53 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:53 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:53 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:53 crc kubenswrapper[4778]: I1205 15:57:53.049848 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:53 crc kubenswrapper[4778]: I1205 15:57:53.407299 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc98d520-7ab5-4623-91d6-066f4390100f","Type":"ContainerStarted","Data":"22b311ef20d9f19b244f1b1f618c9b58fc01b9f53bd44d64c4cf9a98673151fd"} Dec 05 15:57:53 crc kubenswrapper[4778]: I1205 15:57:53.413217 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2bf1e687-285f-496a-b75d-7f6cae5088f2","Type":"ContainerStarted","Data":"48e6f80fac5400a0c1da208d11ee340e4c0db4bb73f33986e971cb45b5cf442a"} Dec 05 15:57:53 crc kubenswrapper[4778]: I1205 15:57:53.425425 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.425408735 podStartE2EDuration="3.425408735s" podCreationTimestamp="2025-12-05 15:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:53.418719521 +0000 UTC m=+160.522515891" watchObservedRunningTime="2025-12-05 15:57:53.425408735 +0000 UTC m=+160.529205115" Dec 05 15:57:53 crc kubenswrapper[4778]: I1205 15:57:53.430217 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.4301980260000002 podStartE2EDuration="3.430198026s" podCreationTimestamp="2025-12-05 15:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:53.428189091 +0000 UTC m=+160.531985471" watchObservedRunningTime="2025-12-05 15:57:53.430198026 +0000 UTC m=+160.533994406" Dec 05 15:57:54 crc kubenswrapper[4778]: I1205 15:57:54.049132 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:54 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:54 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:54 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:54 crc kubenswrapper[4778]: I1205 15:57:54.049200 4778 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:54 crc kubenswrapper[4778]: I1205 15:57:54.423021 4778 generic.go:334] "Generic (PLEG): container finished" podID="2bf1e687-285f-496a-b75d-7f6cae5088f2" containerID="48e6f80fac5400a0c1da208d11ee340e4c0db4bb73f33986e971cb45b5cf442a" exitCode=0 Dec 05 15:57:54 crc kubenswrapper[4778]: I1205 15:57:54.423083 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2bf1e687-285f-496a-b75d-7f6cae5088f2","Type":"ContainerDied","Data":"48e6f80fac5400a0c1da208d11ee340e4c0db4bb73f33986e971cb45b5cf442a"} Dec 05 15:57:54 crc kubenswrapper[4778]: I1205 15:57:54.425512 4778 generic.go:334] "Generic (PLEG): container finished" podID="cc98d520-7ab5-4623-91d6-066f4390100f" containerID="22b311ef20d9f19b244f1b1f618c9b58fc01b9f53bd44d64c4cf9a98673151fd" exitCode=0 Dec 05 15:57:54 crc kubenswrapper[4778]: I1205 15:57:54.425561 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc98d520-7ab5-4623-91d6-066f4390100f","Type":"ContainerDied","Data":"22b311ef20d9f19b244f1b1f618c9b58fc01b9f53bd44d64c4cf9a98673151fd"} Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.048535 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:55 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:55 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:55 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.048596 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.335100 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.346223 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48cc0dd1-7387-4df1-aa6a-198ac40c620d-metrics-certs\") pod \"network-metrics-daemon-8tvxd\" (UID: \"48cc0dd1-7387-4df1-aa6a-198ac40c620d\") " pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.413418 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tvxd" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.693453 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.761864 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8tvxd"] Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.766903 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 15:57:55 crc kubenswrapper[4778]: W1205 15:57:55.769312 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48cc0dd1_7387_4df1_aa6a_198ac40c620d.slice/crio-0cc471260f4266ca549c72e19345a331df8eb1e74737c1b34fd1d88b82b11abf WatchSource:0}: Error finding container 0cc471260f4266ca549c72e19345a331df8eb1e74737c1b34fd1d88b82b11abf: Status 404 returned error can't find the container with id 0cc471260f4266ca549c72e19345a331df8eb1e74737c1b34fd1d88b82b11abf Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.815194 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nf9v5" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.845483 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2bf1e687-285f-496a-b75d-7f6cae5088f2-kube-api-access\") pod \"2bf1e687-285f-496a-b75d-7f6cae5088f2\" (UID: \"2bf1e687-285f-496a-b75d-7f6cae5088f2\") " Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.845583 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc98d520-7ab5-4623-91d6-066f4390100f-kubelet-dir\") pod \"cc98d520-7ab5-4623-91d6-066f4390100f\" (UID: \"cc98d520-7ab5-4623-91d6-066f4390100f\") " Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.845630 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc98d520-7ab5-4623-91d6-066f4390100f-kube-api-access\") pod \"cc98d520-7ab5-4623-91d6-066f4390100f\" (UID: \"cc98d520-7ab5-4623-91d6-066f4390100f\") " Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.845702 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2bf1e687-285f-496a-b75d-7f6cae5088f2-kubelet-dir\") pod \"2bf1e687-285f-496a-b75d-7f6cae5088f2\" (UID: \"2bf1e687-285f-496a-b75d-7f6cae5088f2\") " Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.845701 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc98d520-7ab5-4623-91d6-066f4390100f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cc98d520-7ab5-4623-91d6-066f4390100f" (UID: "cc98d520-7ab5-4623-91d6-066f4390100f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.845940 4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc98d520-7ab5-4623-91d6-066f4390100f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.845971 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bf1e687-285f-496a-b75d-7f6cae5088f2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2bf1e687-285f-496a-b75d-7f6cae5088f2" (UID: "2bf1e687-285f-496a-b75d-7f6cae5088f2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.849049 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bf1e687-285f-496a-b75d-7f6cae5088f2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2bf1e687-285f-496a-b75d-7f6cae5088f2" (UID: "2bf1e687-285f-496a-b75d-7f6cae5088f2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.850340 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc98d520-7ab5-4623-91d6-066f4390100f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cc98d520-7ab5-4623-91d6-066f4390100f" (UID: "cc98d520-7ab5-4623-91d6-066f4390100f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.948008 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc98d520-7ab5-4623-91d6-066f4390100f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.948157 4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2bf1e687-285f-496a-b75d-7f6cae5088f2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 15:57:55 crc kubenswrapper[4778]: I1205 15:57:55.948191 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2bf1e687-285f-496a-b75d-7f6cae5088f2-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 15:57:56 crc kubenswrapper[4778]: I1205 15:57:56.049513 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:56 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:56 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:56 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:56 crc kubenswrapper[4778]: I1205 15:57:56.049757 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:56 crc kubenswrapper[4778]: I1205 15:57:56.444784 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"2bf1e687-285f-496a-b75d-7f6cae5088f2","Type":"ContainerDied","Data":"eefd46356de5c082a4217f5adaed6abeb235bab79720c2473b88ccf0990550cc"} Dec 05 15:57:56 crc kubenswrapper[4778]: I1205 15:57:56.445124 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eefd46356de5c082a4217f5adaed6abeb235bab79720c2473b88ccf0990550cc" Dec 05 15:57:56 crc kubenswrapper[4778]: I1205 15:57:56.444839 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 15:57:56 crc kubenswrapper[4778]: I1205 15:57:56.447062 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 15:57:56 crc kubenswrapper[4778]: I1205 15:57:56.447061 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc98d520-7ab5-4623-91d6-066f4390100f","Type":"ContainerDied","Data":"5a9f9c5640728b7829b64ad34a8cd93373f89e89a11d867dc5ff40d8ea100690"} Dec 05 15:57:56 crc kubenswrapper[4778]: I1205 15:57:56.447166 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a9f9c5640728b7829b64ad34a8cd93373f89e89a11d867dc5ff40d8ea100690" Dec 05 15:57:56 crc kubenswrapper[4778]: I1205 15:57:56.448339 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" event={"ID":"48cc0dd1-7387-4df1-aa6a-198ac40c620d","Type":"ContainerStarted","Data":"0cc471260f4266ca549c72e19345a331df8eb1e74737c1b34fd1d88b82b11abf"} Dec 05 15:57:57 crc kubenswrapper[4778]: I1205 15:57:57.049557 4778 patch_prober.go:28] interesting pod/router-default-5444994796-zzpst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 15:57:57 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Dec 05 15:57:57 crc kubenswrapper[4778]: [+]process-running ok Dec 05 15:57:57 crc kubenswrapper[4778]: healthz check failed Dec 05 15:57:57 crc kubenswrapper[4778]: I1205 15:57:57.049620 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zzpst" podUID="168fbd70-f065-4a1d-965a-c1d67493a528" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 15:57:57 crc kubenswrapper[4778]: I1205 15:57:57.454510 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" event={"ID":"48cc0dd1-7387-4df1-aa6a-198ac40c620d","Type":"ContainerStarted","Data":"d6463824380ac68e02c2a63b88a2ea3d63035cd35b62d60d13f568e8185be83c"} Dec 05 15:57:58 crc kubenswrapper[4778]: I1205 15:57:58.048734 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:58 crc kubenswrapper[4778]: I1205 15:57:58.053358 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-zzpst" Dec 05 15:57:58 crc kubenswrapper[4778]: I1205 15:57:58.463156 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8tvxd" event={"ID":"48cc0dd1-7387-4df1-aa6a-198ac40c620d","Type":"ContainerStarted","Data":"5a3b0ee0ad6a663512fb820b7acfeb2b09ccd38b678fda51c1d829ec67ac058b"} Dec 05 15:57:59 crc kubenswrapper[4778]: I1205 15:57:59.481837 
4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8tvxd" podStartSLOduration=147.481816846 podStartE2EDuration="2m27.481816846s" podCreationTimestamp="2025-12-05 15:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:57:59.480654333 +0000 UTC m=+166.584450713" watchObservedRunningTime="2025-12-05 15:57:59.481816846 +0000 UTC m=+166.585613236" Dec 05 15:58:00 crc kubenswrapper[4778]: I1205 15:58:00.159016 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:58:00 crc kubenswrapper[4778]: I1205 15:58:00.163151 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-gnnls" Dec 05 15:58:03 crc kubenswrapper[4778]: I1205 15:58:03.415018 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 15:58:03 crc kubenswrapper[4778]: I1205 15:58:03.415098 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 15:58:07 crc kubenswrapper[4778]: I1205 15:58:07.700278 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 15:58:09 crc kubenswrapper[4778]: I1205 15:58:09.275336 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 15:58:20 crc kubenswrapper[4778]: I1205 15:58:20.433519 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4mr8q" Dec 05 15:58:21 crc kubenswrapper[4778]: E1205 15:58:21.468322 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 15:58:21 crc kubenswrapper[4778]: E1205 15:58:21.468769 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zd46l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-77q8g_openshift-marketplace(31eaa34f-a155-495d-a833-a54ba9546a1a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 15:58:21 crc kubenswrapper[4778]: E1205 15:58:21.470429 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-77q8g" podUID="31eaa34f-a155-495d-a833-a54ba9546a1a" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.525010 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 15:58:23 crc kubenswrapper[4778]: E1205 15:58:23.525537 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf1e687-285f-496a-b75d-7f6cae5088f2" containerName="pruner" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.525572 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf1e687-285f-496a-b75d-7f6cae5088f2" containerName="pruner" Dec 05 15:58:23 crc kubenswrapper[4778]: E1205 15:58:23.525593 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc98d520-7ab5-4623-91d6-066f4390100f" containerName="pruner" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.525609 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc98d520-7ab5-4623-91d6-066f4390100f" containerName="pruner" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.525840 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bf1e687-285f-496a-b75d-7f6cae5088f2" containerName="pruner" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.525879 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc98d520-7ab5-4623-91d6-066f4390100f" containerName="pruner" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.526715 4778 util.go:30] "No sandbox for pod can be found. 
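The cpu_manager and memory_manager RemoveStaleState entries fire when the new revision-pruner-9-crc pod is admitted: both managers still hold checkpointed assignments for the pruner containers of the two pods that finished above, and drop them before accounting for the new pod. A small sketch of that cleanup, with simplified stand-ins for the managers' checkpoint state:

```go
package main

import "fmt"

// key identifies an assignment the way the log does: by pod UID and
// container name. The state below is a simplified stand-in for the
// cpu_manager/memory_manager checkpoints.
type key struct{ podUID, container string }

func main() {
	assignments := map[key]string{
		{"2bf1e687-285f-496a-b75d-7f6cae5088f2", "pruner"}: "cpuset 0-3",
		{"cc98d520-7ab5-4623-91d6-066f4390100f", "pruner"}: "cpuset 0-3",
	}
	active := map[string]bool{} // neither pruner pod is active any more

	for k := range assignments {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container pod=%s name=%s\n", k.podUID, k.container)
			delete(assignments, k) // safe to delete during range in Go
		}
	}
	fmt.Println("remaining assignments:", len(assignments))
}
```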
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.530481 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.530990 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.531023 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.667800 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7e6f140-3f61-4e58-90fb-cc06884e541e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a7e6f140-3f61-4e58-90fb-cc06884e541e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.667932 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7e6f140-3f61-4e58-90fb-cc06884e541e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a7e6f140-3f61-4e58-90fb-cc06884e541e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.769067 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7e6f140-3f61-4e58-90fb-cc06884e541e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a7e6f140-3f61-4e58-90fb-cc06884e541e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.769188 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7e6f140-3f61-4e58-90fb-cc06884e541e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a7e6f140-3f61-4e58-90fb-cc06884e541e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.769223 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7e6f140-3f61-4e58-90fb-cc06884e541e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a7e6f140-3f61-4e58-90fb-cc06884e541e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.788552 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7e6f140-3f61-4e58-90fb-cc06884e541e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a7e6f140-3f61-4e58-90fb-cc06884e541e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 15:58:23 crc kubenswrapper[4778]: I1205 15:58:23.952349 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 15:58:26 crc kubenswrapper[4778]: E1205 15:58:26.568280 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-77q8g" podUID="31eaa34f-a155-495d-a833-a54ba9546a1a" Dec 05 15:58:26 crc kubenswrapper[4778]: E1205 15:58:26.796967 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 15:58:26 crc kubenswrapper[4778]: E1205 15:58:26.797913 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6xht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xr8gn_openshift-marketplace(866ee44d-6331-47f7-8452-be6f25815a1e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 15:58:26 crc kubenswrapper[4778]: E1205 15:58:26.799207 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xr8gn" podUID="866ee44d-6331-47f7-8452-be6f25815a1e" Dec 05 15:58:28 crc kubenswrapper[4778]: I1205 15:58:28.521683 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 15:58:28 crc kubenswrapper[4778]: I1205 15:58:28.524569 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:58:28 crc kubenswrapper[4778]: I1205 15:58:28.526470 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 15:58:28 crc kubenswrapper[4778]: I1205 15:58:28.657650 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2ebd611-e2d6-45da-a673-29cae7deb0c1-kube-api-access\") pod \"installer-9-crc\" (UID: \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:58:28 crc kubenswrapper[4778]: I1205 15:58:28.657736 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2ebd611-e2d6-45da-a673-29cae7deb0c1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:58:28 crc kubenswrapper[4778]: I1205 15:58:28.658032 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b2ebd611-e2d6-45da-a673-29cae7deb0c1-var-lock\") pod \"installer-9-crc\" (UID: \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:58:28 crc kubenswrapper[4778]: I1205 15:58:28.759677 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2ebd611-e2d6-45da-a673-29cae7deb0c1-kube-api-access\") pod \"installer-9-crc\" (UID: \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:58:28 crc kubenswrapper[4778]: I1205 15:58:28.759727 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2ebd611-e2d6-45da-a673-29cae7deb0c1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:58:28 crc kubenswrapper[4778]: I1205 15:58:28.759780 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b2ebd611-e2d6-45da-a673-29cae7deb0c1-var-lock\") pod \"installer-9-crc\" (UID: \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:58:28 crc kubenswrapper[4778]: I1205 15:58:28.759875 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b2ebd611-e2d6-45da-a673-29cae7deb0c1-var-lock\") pod \"installer-9-crc\" (UID: \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:58:28 crc kubenswrapper[4778]: I1205 15:58:28.759920 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2ebd611-e2d6-45da-a673-29cae7deb0c1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:58:29 crc kubenswrapper[4778]: I1205 15:58:29.215444 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2ebd611-e2d6-45da-a673-29cae7deb0c1-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"b2ebd611-e2d6-45da-a673-29cae7deb0c1\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:58:29 crc kubenswrapper[4778]: E1205 15:58:29.232004 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xr8gn" podUID="866ee44d-6331-47f7-8452-be6f25815a1e" Dec 05 15:58:29 crc kubenswrapper[4778]: E1205 15:58:29.345953 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 15:58:29 crc kubenswrapper[4778]: E1205 15:58:29.346433 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wg6q8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-fj8nh_openshift-marketplace(4047fee6-0560-4d41-8212-aa284021dff0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 15:58:29 crc kubenswrapper[4778]: E1205 15:58:29.347953 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fj8nh" podUID="4047fee6-0560-4d41-8212-aa284021dff0" Dec 05 15:58:29 crc kubenswrapper[4778]: I1205 15:58:29.463016 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.792236 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-fj8nh" podUID="4047fee6-0560-4d41-8212-aa284021dff0" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.864961 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.865492 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-knrld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-wkk25_openshift-marketplace(8cff4721-de89-49fa-9f19-682ec8ae8e64): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.866829 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-wkk25" podUID="8cff4721-de89-49fa-9f19-682ec8ae8e64" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.887416 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.887617 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzhfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7jmzd_openshift-marketplace(840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.889294 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7jmzd" podUID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.897427 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.897903 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qwm9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gwxnr_openshift-marketplace(70c6979b-453d-49a4-889e-e46eff9af778): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.898988 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gwxnr" podUID="70c6979b-453d-49a4-889e-e46eff9af778" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.905140 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.905273 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f2fp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-tlnh4_openshift-marketplace(eb1076bb-639d-42e5-ab8c-d13eb121cc95): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.906893 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tlnh4" podUID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.936854 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.937047 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xvk7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-plzrl_openshift-marketplace(e5267d1d-ec1f-461d-acdc-57303aac7015): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 15:58:30 crc kubenswrapper[4778]: E1205 15:58:30.938235 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-plzrl" podUID="e5267d1d-ec1f-461d-acdc-57303aac7015" Dec 05 15:58:31 crc kubenswrapper[4778]: I1205 15:58:31.210922 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 15:58:31 crc kubenswrapper[4778]: W1205 15:58:31.218737 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb2ebd611_e2d6_45da_a673_29cae7deb0c1.slice/crio-b3a884fa8e4a33d77c8d8c6538d591074fd713b7a70fcd8a2cfe6fe2f15c57c0 WatchSource:0}: Error finding container b3a884fa8e4a33d77c8d8c6538d591074fd713b7a70fcd8a2cfe6fe2f15c57c0: Status 404 returned error can't find the container with id b3a884fa8e4a33d77c8d8c6538d591074fd713b7a70fcd8a2cfe6fe2f15c57c0 Dec 05 15:58:31 crc kubenswrapper[4778]: I1205 15:58:31.255952 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 15:58:31 crc kubenswrapper[4778]: W1205 15:58:31.262039 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda7e6f140_3f61_4e58_90fb_cc06884e541e.slice/crio-566704883e19399ad9773ef1ce53380961599b40867d3b0c862ffc9c5e990334 WatchSource:0}: Error finding container 566704883e19399ad9773ef1ce53380961599b40867d3b0c862ffc9c5e990334: Status 404 returned error can't find the container with id 566704883e19399ad9773ef1ce53380961599b40867d3b0c862ffc9c5e990334 Dec 05 15:58:31 crc kubenswrapper[4778]: I1205 15:58:31.651864 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"b2ebd611-e2d6-45da-a673-29cae7deb0c1","Type":"ContainerStarted","Data":"83b789da940bddb5bfb016b94078beb1b5deddcb5792a2e6b12a4e8633948725"} Dec 05 15:58:31 crc kubenswrapper[4778]: I1205 15:58:31.652252 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b2ebd611-e2d6-45da-a673-29cae7deb0c1","Type":"ContainerStarted","Data":"b3a884fa8e4a33d77c8d8c6538d591074fd713b7a70fcd8a2cfe6fe2f15c57c0"} Dec 05 15:58:31 crc kubenswrapper[4778]: I1205 15:58:31.654533 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a7e6f140-3f61-4e58-90fb-cc06884e541e","Type":"ContainerStarted","Data":"f1b205c347350a8fb2b67db72db3f5a8c40a18c9307b5b0175f87fc4ae4edf36"} Dec 05 15:58:31 crc kubenswrapper[4778]: I1205 15:58:31.654580 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a7e6f140-3f61-4e58-90fb-cc06884e541e","Type":"ContainerStarted","Data":"566704883e19399ad9773ef1ce53380961599b40867d3b0c862ffc9c5e990334"} Dec 05 15:58:31 crc kubenswrapper[4778]: E1205 15:58:31.655762 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-plzrl" podUID="e5267d1d-ec1f-461d-acdc-57303aac7015" Dec 05 15:58:31 crc kubenswrapper[4778]: E1205 15:58:31.656133 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gwxnr" podUID="70c6979b-453d-49a4-889e-e46eff9af778" Dec 05 15:58:31 crc kubenswrapper[4778]: E1205 15:58:31.656402 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7jmzd" podUID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" Dec 05 15:58:31 crc kubenswrapper[4778]: E1205 15:58:31.656432 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tlnh4" podUID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" Dec 05 15:58:31 crc kubenswrapper[4778]: E1205 15:58:31.656806 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-wkk25" podUID="8cff4721-de89-49fa-9f19-682ec8ae8e64" Dec 05 15:58:31 crc kubenswrapper[4778]: I1205 15:58:31.668165 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.668143989 podStartE2EDuration="3.668143989s" podCreationTimestamp="2025-12-05 15:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:58:31.667056349 +0000 UTC 
m=+198.770852729" watchObservedRunningTime="2025-12-05 15:58:31.668143989 +0000 UTC m=+198.771940379" Dec 05 15:58:31 crc kubenswrapper[4778]: I1205 15:58:31.751618 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=8.751590208 podStartE2EDuration="8.751590208s" podCreationTimestamp="2025-12-05 15:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:58:31.750812807 +0000 UTC m=+198.854609197" watchObservedRunningTime="2025-12-05 15:58:31.751590208 +0000 UTC m=+198.855386598" Dec 05 15:58:32 crc kubenswrapper[4778]: I1205 15:58:32.661068 4778 generic.go:334] "Generic (PLEG): container finished" podID="a7e6f140-3f61-4e58-90fb-cc06884e541e" containerID="f1b205c347350a8fb2b67db72db3f5a8c40a18c9307b5b0175f87fc4ae4edf36" exitCode=0 Dec 05 15:58:32 crc kubenswrapper[4778]: I1205 15:58:32.661471 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a7e6f140-3f61-4e58-90fb-cc06884e541e","Type":"ContainerDied","Data":"f1b205c347350a8fb2b67db72db3f5a8c40a18c9307b5b0175f87fc4ae4edf36"} Dec 05 15:58:33 crc kubenswrapper[4778]: I1205 15:58:33.416979 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 15:58:33 crc kubenswrapper[4778]: I1205 15:58:33.418511 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 15:58:33 crc kubenswrapper[4778]: I1205 15:58:33.419472 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 15:58:33 crc kubenswrapper[4778]: I1205 15:58:33.423899 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 15:58:33 crc kubenswrapper[4778]: I1205 15:58:33.424047 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07" gracePeriod=600 Dec 05 15:58:33 crc kubenswrapper[4778]: I1205 15:58:33.668148 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07" exitCode=0 Dec 05 15:58:33 crc kubenswrapper[4778]: I1205 15:58:33.668356 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" 
event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07"} Dec 05 15:58:33 crc kubenswrapper[4778]: I1205 15:58:33.904925 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 15:58:34 crc kubenswrapper[4778]: I1205 15:58:34.028698 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7e6f140-3f61-4e58-90fb-cc06884e541e-kubelet-dir\") pod \"a7e6f140-3f61-4e58-90fb-cc06884e541e\" (UID: \"a7e6f140-3f61-4e58-90fb-cc06884e541e\") " Dec 05 15:58:34 crc kubenswrapper[4778]: I1205 15:58:34.028756 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7e6f140-3f61-4e58-90fb-cc06884e541e-kube-api-access\") pod \"a7e6f140-3f61-4e58-90fb-cc06884e541e\" (UID: \"a7e6f140-3f61-4e58-90fb-cc06884e541e\") " Dec 05 15:58:34 crc kubenswrapper[4778]: I1205 15:58:34.028835 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7e6f140-3f61-4e58-90fb-cc06884e541e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a7e6f140-3f61-4e58-90fb-cc06884e541e" (UID: "a7e6f140-3f61-4e58-90fb-cc06884e541e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:58:34 crc kubenswrapper[4778]: I1205 15:58:34.029056 4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7e6f140-3f61-4e58-90fb-cc06884e541e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 15:58:34 crc kubenswrapper[4778]: I1205 15:58:34.037656 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e6f140-3f61-4e58-90fb-cc06884e541e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a7e6f140-3f61-4e58-90fb-cc06884e541e" (UID: "a7e6f140-3f61-4e58-90fb-cc06884e541e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:58:34 crc kubenswrapper[4778]: I1205 15:58:34.129920 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7e6f140-3f61-4e58-90fb-cc06884e541e-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 15:58:34 crc kubenswrapper[4778]: I1205 15:58:34.679719 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"3d94c7d5a3c87642bb8001766b12e02cd4c800446b73b1de7ad07648c1824c6e"} Dec 05 15:58:34 crc kubenswrapper[4778]: I1205 15:58:34.686600 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a7e6f140-3f61-4e58-90fb-cc06884e541e","Type":"ContainerDied","Data":"566704883e19399ad9773ef1ce53380961599b40867d3b0c862ffc9c5e990334"} Dec 05 15:58:34 crc kubenswrapper[4778]: I1205 15:58:34.686661 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="566704883e19399ad9773ef1ce53380961599b40867d3b0c862ffc9c5e990334" Dec 05 15:58:34 crc kubenswrapper[4778]: I1205 15:58:34.686749 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 15:58:38 crc kubenswrapper[4778]: I1205 15:58:38.706633 4778 generic.go:334] "Generic (PLEG): container finished" podID="31eaa34f-a155-495d-a833-a54ba9546a1a" containerID="4816909baceeb54fbf5b5b537fc486a1e7493b1fe9d81d9f9e2226d40e260b81" exitCode=0 Dec 05 15:58:38 crc kubenswrapper[4778]: I1205 15:58:38.706716 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77q8g" event={"ID":"31eaa34f-a155-495d-a833-a54ba9546a1a","Type":"ContainerDied","Data":"4816909baceeb54fbf5b5b537fc486a1e7493b1fe9d81d9f9e2226d40e260b81"} Dec 05 15:58:39 crc kubenswrapper[4778]: I1205 15:58:39.715922 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77q8g" event={"ID":"31eaa34f-a155-495d-a833-a54ba9546a1a","Type":"ContainerStarted","Data":"27c64f0021c8b4b20336d176e6e4f8abd68b2f9637e76496338585213b5636f5"} Dec 05 15:58:39 crc kubenswrapper[4778]: I1205 15:58:39.737102 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-77q8g" podStartSLOduration=2.68295603 podStartE2EDuration="54.737078211s" podCreationTimestamp="2025-12-05 15:57:45 +0000 UTC" firstStartedPulling="2025-12-05 15:57:47.223742103 +0000 UTC m=+154.327538483" lastFinishedPulling="2025-12-05 15:58:39.277864284 +0000 UTC m=+206.381660664" observedRunningTime="2025-12-05 15:58:39.73520693 +0000 UTC m=+206.839003320" watchObservedRunningTime="2025-12-05 15:58:39.737078211 +0000 UTC m=+206.840874631" Dec 05 15:58:42 crc kubenswrapper[4778]: I1205 15:58:42.734025 4778 generic.go:334] "Generic (PLEG): container finished" podID="4047fee6-0560-4d41-8212-aa284021dff0" containerID="ea97297539a10c093541d345debc363b59b26911c1f7aba1befd371311b2d154" exitCode=0 Dec 05 15:58:42 crc kubenswrapper[4778]: I1205 15:58:42.734114 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj8nh" event={"ID":"4047fee6-0560-4d41-8212-aa284021dff0","Type":"ContainerDied","Data":"ea97297539a10c093541d345debc363b59b26911c1f7aba1befd371311b2d154"} Dec 05 15:58:43 crc kubenswrapper[4778]: I1205 15:58:43.741959 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj8nh" event={"ID":"4047fee6-0560-4d41-8212-aa284021dff0","Type":"ContainerStarted","Data":"3e9d7d97beb14e503997cc82982bbfc337a58287d0b3f926d663dbba6ab82920"} Dec 05 15:58:43 crc kubenswrapper[4778]: I1205 15:58:43.761467 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fj8nh" podStartSLOduration=2.891287947 podStartE2EDuration="56.761440139s" podCreationTimestamp="2025-12-05 15:57:47 +0000 UTC" firstStartedPulling="2025-12-05 15:57:49.295117033 +0000 UTC m=+156.398913413" lastFinishedPulling="2025-12-05 15:58:43.165269225 +0000 UTC m=+210.269065605" observedRunningTime="2025-12-05 15:58:43.760633847 +0000 UTC m=+210.864430247" watchObservedRunningTime="2025-12-05 15:58:43.761440139 +0000 UTC m=+210.865236519" Dec 05 15:58:44 crc kubenswrapper[4778]: I1205 15:58:44.751482 4778 generic.go:334] "Generic (PLEG): container finished" podID="e5267d1d-ec1f-461d-acdc-57303aac7015" containerID="2ec4ca6a996812bc32c022260faddbf1df943351747bde3aee192bc0d228f623" exitCode=0 Dec 05 15:58:44 crc kubenswrapper[4778]: I1205 15:58:44.751558 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-plzrl" event={"ID":"e5267d1d-ec1f-461d-acdc-57303aac7015","Type":"ContainerDied","Data":"2ec4ca6a996812bc32c022260faddbf1df943351747bde3aee192bc0d228f623"} Dec 05 15:58:44 crc kubenswrapper[4778]: I1205 15:58:44.754438 4778 generic.go:334] "Generic (PLEG): container finished" podID="70c6979b-453d-49a4-889e-e46eff9af778" containerID="77ae6602820cc302573f8db0f1e03119b2b0ad85873854d51cfcfdcdb3c81250" exitCode=0 Dec 05 15:58:44 crc kubenswrapper[4778]: I1205 15:58:44.754602 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwxnr" event={"ID":"70c6979b-453d-49a4-889e-e46eff9af778","Type":"ContainerDied","Data":"77ae6602820cc302573f8db0f1e03119b2b0ad85873854d51cfcfdcdb3c81250"} Dec 05 15:58:45 crc kubenswrapper[4778]: I1205 15:58:45.762283 4778 generic.go:334] "Generic (PLEG): container finished" podID="866ee44d-6331-47f7-8452-be6f25815a1e" containerID="a0da7c4d8e7e54594440893ea856df165eedd6b87a2d303eeb67ea419cfa706f" exitCode=0 Dec 05 15:58:45 crc kubenswrapper[4778]: I1205 15:58:45.762605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr8gn" event={"ID":"866ee44d-6331-47f7-8452-be6f25815a1e","Type":"ContainerDied","Data":"a0da7c4d8e7e54594440893ea856df165eedd6b87a2d303eeb67ea419cfa706f"} Dec 05 15:58:45 crc kubenswrapper[4778]: I1205 15:58:45.765704 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwxnr" event={"ID":"70c6979b-453d-49a4-889e-e46eff9af778","Type":"ContainerStarted","Data":"b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6"} Dec 05 15:58:45 crc kubenswrapper[4778]: I1205 15:58:45.767744 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plzrl" event={"ID":"e5267d1d-ec1f-461d-acdc-57303aac7015","Type":"ContainerStarted","Data":"4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17"} Dec 05 15:58:45 crc kubenswrapper[4778]: I1205 15:58:45.770965 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jmzd" event={"ID":"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa","Type":"ContainerStarted","Data":"07e27d6e1873a5a1c47e56d56454afe6a87a3370eb73f67753e0528c6171eeb1"} Dec 05 15:58:45 crc kubenswrapper[4778]: I1205 15:58:45.818289 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-plzrl" podStartSLOduration=2.915060178 podStartE2EDuration="1m0.818269763s" podCreationTimestamp="2025-12-05 15:57:45 +0000 UTC" firstStartedPulling="2025-12-05 15:57:47.22505119 +0000 UTC m=+154.328847570" lastFinishedPulling="2025-12-05 15:58:45.128260775 +0000 UTC m=+212.232057155" observedRunningTime="2025-12-05 15:58:45.813357279 +0000 UTC m=+212.917153659" watchObservedRunningTime="2025-12-05 15:58:45.818269763 +0000 UTC m=+212.922066133" Dec 05 15:58:45 crc kubenswrapper[4778]: I1205 15:58:45.830313 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gwxnr" podStartSLOduration=1.856699686 podStartE2EDuration="58.830298334s" podCreationTimestamp="2025-12-05 15:57:47 +0000 UTC" firstStartedPulling="2025-12-05 15:57:48.286460262 +0000 UTC m=+155.390256632" lastFinishedPulling="2025-12-05 15:58:45.2600589 +0000 UTC m=+212.363855280" observedRunningTime="2025-12-05 15:58:45.82907391 +0000 UTC m=+212.932870290" watchObservedRunningTime="2025-12-05 
15:58:45.830298334 +0000 UTC m=+212.934094714" Dec 05 15:58:46 crc kubenswrapper[4778]: I1205 15:58:46.096851 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:58:46 crc kubenswrapper[4778]: I1205 15:58:46.096924 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:58:46 crc kubenswrapper[4778]: I1205 15:58:46.190127 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:58:46 crc kubenswrapper[4778]: I1205 15:58:46.779120 4778 generic.go:334] "Generic (PLEG): container finished" podID="8cff4721-de89-49fa-9f19-682ec8ae8e64" containerID="dbdd53eaaaf60dcfecd0142cbcf03513d4ab3d32d7ecf2907c470c5b737d5fa9" exitCode=0 Dec 05 15:58:46 crc kubenswrapper[4778]: I1205 15:58:46.779229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkk25" event={"ID":"8cff4721-de89-49fa-9f19-682ec8ae8e64","Type":"ContainerDied","Data":"dbdd53eaaaf60dcfecd0142cbcf03513d4ab3d32d7ecf2907c470c5b737d5fa9"} Dec 05 15:58:46 crc kubenswrapper[4778]: I1205 15:58:46.782829 4778 generic.go:334] "Generic (PLEG): container finished" podID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" containerID="b95e816204bcbcf9dad9e4cd1d127c73954e4e6161be032395b1722c8fd0087c" exitCode=0 Dec 05 15:58:46 crc kubenswrapper[4778]: I1205 15:58:46.782887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlnh4" event={"ID":"eb1076bb-639d-42e5-ab8c-d13eb121cc95","Type":"ContainerDied","Data":"b95e816204bcbcf9dad9e4cd1d127c73954e4e6161be032395b1722c8fd0087c"} Dec 05 15:58:46 crc kubenswrapper[4778]: I1205 15:58:46.784809 4778 generic.go:334] "Generic (PLEG): container finished" podID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" containerID="07e27d6e1873a5a1c47e56d56454afe6a87a3370eb73f67753e0528c6171eeb1" exitCode=0 Dec 05 15:58:46 crc kubenswrapper[4778]: I1205 15:58:46.784871 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jmzd" event={"ID":"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa","Type":"ContainerDied","Data":"07e27d6e1873a5a1c47e56d56454afe6a87a3370eb73f67753e0528c6171eeb1"} Dec 05 15:58:46 crc kubenswrapper[4778]: I1205 15:58:46.787002 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr8gn" event={"ID":"866ee44d-6331-47f7-8452-be6f25815a1e","Type":"ContainerStarted","Data":"40853ae3f7aa925dc41cfad39bac91f0c9ef6104f317b0f69e4808fcafc6740f"} Dec 05 15:58:46 crc kubenswrapper[4778]: I1205 15:58:46.855142 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:58:46 crc kubenswrapper[4778]: I1205 15:58:46.857900 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xr8gn" podStartSLOduration=3.018159702 podStartE2EDuration="58.857885762s" podCreationTimestamp="2025-12-05 15:57:48 +0000 UTC" firstStartedPulling="2025-12-05 15:57:50.302245054 +0000 UTC m=+157.406041434" lastFinishedPulling="2025-12-05 15:58:46.141971114 +0000 UTC m=+213.245767494" observedRunningTime="2025-12-05 15:58:46.85707389 +0000 UTC m=+213.960870270" watchObservedRunningTime="2025-12-05 15:58:46.857885762 +0000 UTC m=+213.961682142" Dec 05 15:58:47 crc kubenswrapper[4778]: I1205 15:58:47.491992 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:58:47 crc kubenswrapper[4778]: I1205 15:58:47.492070 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:58:47 crc kubenswrapper[4778]: I1205 15:58:47.546396 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:58:47 crc kubenswrapper[4778]: I1205 15:58:47.793129 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlnh4" event={"ID":"eb1076bb-639d-42e5-ab8c-d13eb121cc95","Type":"ContainerStarted","Data":"6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74"} Dec 05 15:58:47 crc kubenswrapper[4778]: I1205 15:58:47.795967 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jmzd" event={"ID":"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa","Type":"ContainerStarted","Data":"af7ad15e913ea74bb55f9c4267abbd23e823e4363a160dd91cd56f40ce594b4f"} Dec 05 15:58:47 crc kubenswrapper[4778]: I1205 15:58:47.797947 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkk25" event={"ID":"8cff4721-de89-49fa-9f19-682ec8ae8e64","Type":"ContainerStarted","Data":"54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae"} Dec 05 15:58:47 crc kubenswrapper[4778]: I1205 15:58:47.818041 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tlnh4" podStartSLOduration=2.7289040460000002 podStartE2EDuration="59.818013321s" podCreationTimestamp="2025-12-05 15:57:48 +0000 UTC" firstStartedPulling="2025-12-05 15:57:50.311861526 +0000 UTC m=+157.415657906" lastFinishedPulling="2025-12-05 15:58:47.400970801 +0000 UTC m=+214.504767181" observedRunningTime="2025-12-05 15:58:47.815190824 +0000 UTC m=+214.918987204" watchObservedRunningTime="2025-12-05 15:58:47.818013321 +0000 UTC m=+214.921809711" Dec 05 15:58:47 crc kubenswrapper[4778]: I1205 15:58:47.838922 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wkk25" podStartSLOduration=2.8964839380000003 podStartE2EDuration="1m2.838905205s" podCreationTimestamp="2025-12-05 15:57:45 +0000 UTC" firstStartedPulling="2025-12-05 15:57:47.229536592 +0000 UTC m=+154.333332962" lastFinishedPulling="2025-12-05 15:58:47.171957849 +0000 UTC m=+214.275754229" observedRunningTime="2025-12-05 15:58:47.835566433 +0000 UTC m=+214.939362813" watchObservedRunningTime="2025-12-05 15:58:47.838905205 +0000 UTC m=+214.942701585" Dec 05 15:58:47 crc kubenswrapper[4778]: I1205 15:58:47.855938 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7jmzd" podStartSLOduration=2.8135971250000003 podStartE2EDuration="1m2.855919172s" podCreationTimestamp="2025-12-05 15:57:45 +0000 UTC" firstStartedPulling="2025-12-05 15:57:47.220967147 +0000 UTC m=+154.324763527" lastFinishedPulling="2025-12-05 15:58:47.263289194 +0000 UTC m=+214.367085574" observedRunningTime="2025-12-05 15:58:47.855549602 +0000 UTC m=+214.959346002" watchObservedRunningTime="2025-12-05 15:58:47.855919172 +0000 UTC m=+214.959715552" Dec 05 15:58:47 crc kubenswrapper[4778]: I1205 15:58:47.932136 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:58:47 crc kubenswrapper[4778]: I1205 15:58:47.932457 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:58:47 crc kubenswrapper[4778]: I1205 15:58:47.982807 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:58:48 crc kubenswrapper[4778]: I1205 15:58:48.562150 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77q8g"] Dec 05 15:58:48 crc kubenswrapper[4778]: I1205 15:58:48.711848 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:58:48 crc kubenswrapper[4778]: I1205 15:58:48.711945 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:58:48 crc kubenswrapper[4778]: I1205 15:58:48.804291 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-77q8g" podUID="31eaa34f-a155-495d-a833-a54ba9546a1a" containerName="registry-server" containerID="cri-o://27c64f0021c8b4b20336d176e6e4f8abd68b2f9637e76496338585213b5636f5" gracePeriod=2 Dec 05 15:58:48 crc kubenswrapper[4778]: I1205 15:58:48.853415 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:58:49 crc kubenswrapper[4778]: I1205 15:58:49.099256 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:58:49 crc kubenswrapper[4778]: I1205 15:58:49.099573 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:58:49 crc kubenswrapper[4778]: I1205 15:58:49.568041 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj8nh"] Dec 05 15:58:49 crc kubenswrapper[4778]: I1205 15:58:49.755096 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tlnh4" podUID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" containerName="registry-server" probeResult="failure" output=< Dec 05 15:58:49 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Dec 05 15:58:49 crc kubenswrapper[4778]: > Dec 05 15:58:49 crc kubenswrapper[4778]: I1205 15:58:49.816033 4778 generic.go:334] "Generic (PLEG): container finished" podID="31eaa34f-a155-495d-a833-a54ba9546a1a" containerID="27c64f0021c8b4b20336d176e6e4f8abd68b2f9637e76496338585213b5636f5" exitCode=0 Dec 05 15:58:49 crc kubenswrapper[4778]: I1205 15:58:49.816674 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77q8g" event={"ID":"31eaa34f-a155-495d-a833-a54ba9546a1a","Type":"ContainerDied","Data":"27c64f0021c8b4b20336d176e6e4f8abd68b2f9637e76496338585213b5636f5"} Dec 05 15:58:50 crc kubenswrapper[4778]: I1205 15:58:50.139343 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xr8gn" podUID="866ee44d-6331-47f7-8452-be6f25815a1e" containerName="registry-server" probeResult="failure" output=< Dec 05 15:58:50 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Dec 05 15:58:50 crc kubenswrapper[4778]: > Dec 05 15:58:50 crc kubenswrapper[4778]: I1205 15:58:50.820032 
4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fj8nh" podUID="4047fee6-0560-4d41-8212-aa284021dff0" containerName="registry-server" containerID="cri-o://3e9d7d97beb14e503997cc82982bbfc337a58287d0b3f926d663dbba6ab82920" gracePeriod=2 Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.261401 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.360486 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31eaa34f-a155-495d-a833-a54ba9546a1a-utilities\") pod \"31eaa34f-a155-495d-a833-a54ba9546a1a\" (UID: \"31eaa34f-a155-495d-a833-a54ba9546a1a\") " Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.360555 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31eaa34f-a155-495d-a833-a54ba9546a1a-catalog-content\") pod \"31eaa34f-a155-495d-a833-a54ba9546a1a\" (UID: \"31eaa34f-a155-495d-a833-a54ba9546a1a\") " Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.360613 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd46l\" (UniqueName: \"kubernetes.io/projected/31eaa34f-a155-495d-a833-a54ba9546a1a-kube-api-access-zd46l\") pod \"31eaa34f-a155-495d-a833-a54ba9546a1a\" (UID: \"31eaa34f-a155-495d-a833-a54ba9546a1a\") " Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.361259 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31eaa34f-a155-495d-a833-a54ba9546a1a-utilities" (OuterVolumeSpecName: "utilities") pod "31eaa34f-a155-495d-a833-a54ba9546a1a" (UID: "31eaa34f-a155-495d-a833-a54ba9546a1a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.365310 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31eaa34f-a155-495d-a833-a54ba9546a1a-kube-api-access-zd46l" (OuterVolumeSpecName: "kube-api-access-zd46l") pod "31eaa34f-a155-495d-a833-a54ba9546a1a" (UID: "31eaa34f-a155-495d-a833-a54ba9546a1a"). InnerVolumeSpecName "kube-api-access-zd46l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.413046 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31eaa34f-a155-495d-a833-a54ba9546a1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31eaa34f-a155-495d-a833-a54ba9546a1a" (UID: "31eaa34f-a155-495d-a833-a54ba9546a1a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.461933 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31eaa34f-a155-495d-a833-a54ba9546a1a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.461967 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31eaa34f-a155-495d-a833-a54ba9546a1a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.461980 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd46l\" (UniqueName: \"kubernetes.io/projected/31eaa34f-a155-495d-a833-a54ba9546a1a-kube-api-access-zd46l\") on node \"crc\" DevicePath \"\"" Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.827326 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77q8g" event={"ID":"31eaa34f-a155-495d-a833-a54ba9546a1a","Type":"ContainerDied","Data":"460b27a196277ee289c272fca8c37b495e919e6db4ef5f31d1e0b8794c75d44f"} Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.827451 4778 scope.go:117] "RemoveContainer" containerID="27c64f0021c8b4b20336d176e6e4f8abd68b2f9637e76496338585213b5636f5" Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.827352 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77q8g" Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.829909 4778 generic.go:334] "Generic (PLEG): container finished" podID="4047fee6-0560-4d41-8212-aa284021dff0" containerID="3e9d7d97beb14e503997cc82982bbfc337a58287d0b3f926d663dbba6ab82920" exitCode=0 Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.829948 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj8nh" event={"ID":"4047fee6-0560-4d41-8212-aa284021dff0","Type":"ContainerDied","Data":"3e9d7d97beb14e503997cc82982bbfc337a58287d0b3f926d663dbba6ab82920"} Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.844736 4778 scope.go:117] "RemoveContainer" containerID="4816909baceeb54fbf5b5b537fc486a1e7493b1fe9d81d9f9e2226d40e260b81" Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.861588 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77q8g"] Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.864731 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-77q8g"] Dec 05 15:58:51 crc kubenswrapper[4778]: I1205 15:58:51.874453 4778 scope.go:117] "RemoveContainer" containerID="e599e057b68a85b36e8fea7866e3ee7a41e9cb941e746329fc97003e261beaae" Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.213176 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.272421 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4047fee6-0560-4d41-8212-aa284021dff0-catalog-content\") pod \"4047fee6-0560-4d41-8212-aa284021dff0\" (UID: \"4047fee6-0560-4d41-8212-aa284021dff0\") " Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.272716 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg6q8\" (UniqueName: \"kubernetes.io/projected/4047fee6-0560-4d41-8212-aa284021dff0-kube-api-access-wg6q8\") pod \"4047fee6-0560-4d41-8212-aa284021dff0\" (UID: \"4047fee6-0560-4d41-8212-aa284021dff0\") " Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.272904 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4047fee6-0560-4d41-8212-aa284021dff0-utilities\") pod \"4047fee6-0560-4d41-8212-aa284021dff0\" (UID: \"4047fee6-0560-4d41-8212-aa284021dff0\") " Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.274179 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4047fee6-0560-4d41-8212-aa284021dff0-utilities" (OuterVolumeSpecName: "utilities") pod "4047fee6-0560-4d41-8212-aa284021dff0" (UID: "4047fee6-0560-4d41-8212-aa284021dff0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.278351 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4047fee6-0560-4d41-8212-aa284021dff0-kube-api-access-wg6q8" (OuterVolumeSpecName: "kube-api-access-wg6q8") pod "4047fee6-0560-4d41-8212-aa284021dff0" (UID: "4047fee6-0560-4d41-8212-aa284021dff0"). InnerVolumeSpecName "kube-api-access-wg6q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.299237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4047fee6-0560-4d41-8212-aa284021dff0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4047fee6-0560-4d41-8212-aa284021dff0" (UID: "4047fee6-0560-4d41-8212-aa284021dff0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.374604 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg6q8\" (UniqueName: \"kubernetes.io/projected/4047fee6-0560-4d41-8212-aa284021dff0-kube-api-access-wg6q8\") on node \"crc\" DevicePath \"\"" Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.374847 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4047fee6-0560-4d41-8212-aa284021dff0-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.374910 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4047fee6-0560-4d41-8212-aa284021dff0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.837024 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj8nh" event={"ID":"4047fee6-0560-4d41-8212-aa284021dff0","Type":"ContainerDied","Data":"3855611cbe647602b6a6c843489ce64819971bae0b2bff928275406ae66dde26"} Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.837073 4778 scope.go:117] "RemoveContainer" containerID="3e9d7d97beb14e503997cc82982bbfc337a58287d0b3f926d663dbba6ab82920" Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.837204 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj8nh" Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.855067 4778 scope.go:117] "RemoveContainer" containerID="ea97297539a10c093541d345debc363b59b26911c1f7aba1befd371311b2d154" Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.870147 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj8nh"] Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.874089 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj8nh"] Dec 05 15:58:52 crc kubenswrapper[4778]: I1205 15:58:52.889695 4778 scope.go:117] "RemoveContainer" containerID="3f317cb916b5a76c33c3c3f7528610da54934e45626536ec1a4c1232511fef21" Dec 05 15:58:53 crc kubenswrapper[4778]: I1205 15:58:53.258576 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31eaa34f-a155-495d-a833-a54ba9546a1a" path="/var/lib/kubelet/pods/31eaa34f-a155-495d-a833-a54ba9546a1a/volumes" Dec 05 15:58:53 crc kubenswrapper[4778]: I1205 15:58:53.259595 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4047fee6-0560-4d41-8212-aa284021dff0" path="/var/lib/kubelet/pods/4047fee6-0560-4d41-8212-aa284021dff0/volumes" Dec 05 15:58:55 crc kubenswrapper[4778]: I1205 15:58:55.535197 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:58:55 crc kubenswrapper[4778]: I1205 15:58:55.535706 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:58:55 crc kubenswrapper[4778]: I1205 15:58:55.608940 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:58:55 crc kubenswrapper[4778]: I1205 15:58:55.689104 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:58:55 crc 
kubenswrapper[4778]: I1205 15:58:55.689410 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:58:55 crc kubenswrapper[4778]: I1205 15:58:55.725430 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:58:55 crc kubenswrapper[4778]: I1205 15:58:55.899238 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wkk25" Dec 05 15:58:55 crc kubenswrapper[4778]: I1205 15:58:55.913741 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-plzrl" Dec 05 15:58:55 crc kubenswrapper[4778]: I1205 15:58:55.930666 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:58:55 crc kubenswrapper[4778]: I1205 15:58:55.930717 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:58:55 crc kubenswrapper[4778]: I1205 15:58:55.984026 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:58:56 crc kubenswrapper[4778]: I1205 15:58:56.905771 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:58:57 crc kubenswrapper[4778]: I1205 15:58:57.559466 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 15:58:58 crc kubenswrapper[4778]: I1205 15:58:58.770503 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:58:58 crc kubenswrapper[4778]: I1205 15:58:58.825503 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 15:58:58 crc kubenswrapper[4778]: I1205 15:58:58.960719 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7jmzd"] Dec 05 15:58:58 crc kubenswrapper[4778]: I1205 15:58:58.960968 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7jmzd" podUID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" containerName="registry-server" containerID="cri-o://af7ad15e913ea74bb55f9c4267abbd23e823e4363a160dd91cd56f40ce594b4f" gracePeriod=2 Dec 05 15:58:59 crc kubenswrapper[4778]: I1205 15:58:59.142139 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:58:59 crc kubenswrapper[4778]: I1205 15:58:59.182622 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:58:59 crc kubenswrapper[4778]: I1205 15:58:59.491183 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ln57b"] Dec 05 15:58:59 crc kubenswrapper[4778]: I1205 15:58:59.879521 4778 generic.go:334] "Generic (PLEG): container finished" podID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" containerID="af7ad15e913ea74bb55f9c4267abbd23e823e4363a160dd91cd56f40ce594b4f" exitCode=0 Dec 05 15:58:59 crc kubenswrapper[4778]: I1205 15:58:59.879558 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-7jmzd" event={"ID":"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa","Type":"ContainerDied","Data":"af7ad15e913ea74bb55f9c4267abbd23e823e4363a160dd91cd56f40ce594b4f"} Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.641950 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.795226 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzhfp\" (UniqueName: \"kubernetes.io/projected/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-kube-api-access-lzhfp\") pod \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\" (UID: \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\") " Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.795318 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-catalog-content\") pod \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\" (UID: \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\") " Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.796348 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-utilities\") pod \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\" (UID: \"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa\") " Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.797124 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-utilities" (OuterVolumeSpecName: "utilities") pod "840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" (UID: "840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.800466 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-kube-api-access-lzhfp" (OuterVolumeSpecName: "kube-api-access-lzhfp") pod "840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" (UID: "840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa"). InnerVolumeSpecName "kube-api-access-lzhfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.844532 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" (UID: "840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.887876 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jmzd" event={"ID":"840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa","Type":"ContainerDied","Data":"5c1755539a4ebf1abef2b256aa9c17a45ace27074ec2b15ba1afb202c10429d2"} Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.887931 4778 scope.go:117] "RemoveContainer" containerID="af7ad15e913ea74bb55f9c4267abbd23e823e4363a160dd91cd56f40ce594b4f" Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.887948 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7jmzd" Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.898061 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.898096 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzhfp\" (UniqueName: \"kubernetes.io/projected/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-kube-api-access-lzhfp\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.898110 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.910729 4778 scope.go:117] "RemoveContainer" containerID="07e27d6e1873a5a1c47e56d56454afe6a87a3370eb73f67753e0528c6171eeb1" Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.923180 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7jmzd"] Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.933413 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7jmzd"] Dec 05 15:59:00 crc kubenswrapper[4778]: I1205 15:59:00.949825 4778 scope.go:117] "RemoveContainer" containerID="bb145dd12b0ccfece3846efa5388f9eac39608fdeb5ffc38eace30e4ddd73ed9" Dec 05 15:59:01 crc kubenswrapper[4778]: I1205 15:59:01.257272 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" path="/var/lib/kubelet/pods/840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa/volumes" Dec 05 15:59:01 crc kubenswrapper[4778]: I1205 15:59:01.975263 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xr8gn"] Dec 05 15:59:01 crc kubenswrapper[4778]: I1205 15:59:01.975792 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xr8gn" podUID="866ee44d-6331-47f7-8452-be6f25815a1e" containerName="registry-server" containerID="cri-o://40853ae3f7aa925dc41cfad39bac91f0c9ef6104f317b0f69e4808fcafc6740f" gracePeriod=2 Dec 05 15:59:06 crc kubenswrapper[4778]: I1205 15:59:06.449154 4778 generic.go:334] "Generic (PLEG): container finished" podID="866ee44d-6331-47f7-8452-be6f25815a1e" containerID="40853ae3f7aa925dc41cfad39bac91f0c9ef6104f317b0f69e4808fcafc6740f" exitCode=0 Dec 05 15:59:06 crc kubenswrapper[4778]: I1205 15:59:06.449515 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr8gn" event={"ID":"866ee44d-6331-47f7-8452-be6f25815a1e","Type":"ContainerDied","Data":"40853ae3f7aa925dc41cfad39bac91f0c9ef6104f317b0f69e4808fcafc6740f"} Dec 05 15:59:06 crc kubenswrapper[4778]: I1205 15:59:06.593802 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:59:06 crc kubenswrapper[4778]: I1205 15:59:06.670659 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/866ee44d-6331-47f7-8452-be6f25815a1e-utilities\") pod \"866ee44d-6331-47f7-8452-be6f25815a1e\" (UID: \"866ee44d-6331-47f7-8452-be6f25815a1e\") " Dec 05 15:59:06 crc kubenswrapper[4778]: I1205 15:59:06.670725 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6xht\" (UniqueName: \"kubernetes.io/projected/866ee44d-6331-47f7-8452-be6f25815a1e-kube-api-access-x6xht\") pod \"866ee44d-6331-47f7-8452-be6f25815a1e\" (UID: \"866ee44d-6331-47f7-8452-be6f25815a1e\") " Dec 05 15:59:06 crc kubenswrapper[4778]: I1205 15:59:06.670797 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/866ee44d-6331-47f7-8452-be6f25815a1e-catalog-content\") pod \"866ee44d-6331-47f7-8452-be6f25815a1e\" (UID: \"866ee44d-6331-47f7-8452-be6f25815a1e\") " Dec 05 15:59:06 crc kubenswrapper[4778]: I1205 15:59:06.672591 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/866ee44d-6331-47f7-8452-be6f25815a1e-utilities" (OuterVolumeSpecName: "utilities") pod "866ee44d-6331-47f7-8452-be6f25815a1e" (UID: "866ee44d-6331-47f7-8452-be6f25815a1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:59:06 crc kubenswrapper[4778]: I1205 15:59:06.676628 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/866ee44d-6331-47f7-8452-be6f25815a1e-kube-api-access-x6xht" (OuterVolumeSpecName: "kube-api-access-x6xht") pod "866ee44d-6331-47f7-8452-be6f25815a1e" (UID: "866ee44d-6331-47f7-8452-be6f25815a1e"). InnerVolumeSpecName "kube-api-access-x6xht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:59:06 crc kubenswrapper[4778]: I1205 15:59:06.772337 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6xht\" (UniqueName: \"kubernetes.io/projected/866ee44d-6331-47f7-8452-be6f25815a1e-kube-api-access-x6xht\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:06 crc kubenswrapper[4778]: I1205 15:59:06.772386 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/866ee44d-6331-47f7-8452-be6f25815a1e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:06 crc kubenswrapper[4778]: I1205 15:59:06.785532 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/866ee44d-6331-47f7-8452-be6f25815a1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "866ee44d-6331-47f7-8452-be6f25815a1e" (UID: "866ee44d-6331-47f7-8452-be6f25815a1e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 15:59:06 crc kubenswrapper[4778]: I1205 15:59:06.873251 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/866ee44d-6331-47f7-8452-be6f25815a1e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:07 crc kubenswrapper[4778]: I1205 15:59:07.461499 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr8gn" event={"ID":"866ee44d-6331-47f7-8452-be6f25815a1e","Type":"ContainerDied","Data":"0edf935d4e0b5ae7ecfb618d388746b5db541cfde2678792396bc4f885ad184f"} Dec 05 15:59:07 crc kubenswrapper[4778]: I1205 15:59:07.461791 4778 scope.go:117] "RemoveContainer" containerID="40853ae3f7aa925dc41cfad39bac91f0c9ef6104f317b0f69e4808fcafc6740f" Dec 05 15:59:07 crc kubenswrapper[4778]: I1205 15:59:07.461636 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xr8gn" Dec 05 15:59:07 crc kubenswrapper[4778]: I1205 15:59:07.483285 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xr8gn"] Dec 05 15:59:07 crc kubenswrapper[4778]: I1205 15:59:07.486869 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xr8gn"] Dec 05 15:59:07 crc kubenswrapper[4778]: I1205 15:59:07.497195 4778 scope.go:117] "RemoveContainer" containerID="a0da7c4d8e7e54594440893ea856df165eedd6b87a2d303eeb67ea419cfa706f" Dec 05 15:59:07 crc kubenswrapper[4778]: I1205 15:59:07.515418 4778 scope.go:117] "RemoveContainer" containerID="a9fa942803b6fb651824147607800240a3ed86de42fb24b84e3c2d159293a8c5" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.256660 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="866ee44d-6331-47f7-8452-be6f25815a1e" path="/var/lib/kubelet/pods/866ee44d-6331-47f7-8452-be6f25815a1e/volumes" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.301359 4778 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.301897 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63" gracePeriod=15 Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.301958 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907" gracePeriod=15 Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.302002 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5" gracePeriod=15 Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.302096 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
containerID="cri-o://336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258" gracePeriod=15 Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.302182 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8" gracePeriod=15 Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.302673 4778 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.302898 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.302911 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.302921 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.302927 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.302938 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" containerName="extract-content" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.302944 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" containerName="extract-content" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.302952 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.302957 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.302967 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866ee44d-6331-47f7-8452-be6f25815a1e" containerName="extract-utilities" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.302973 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="866ee44d-6331-47f7-8452-be6f25815a1e" containerName="extract-utilities" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.302981 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" containerName="registry-server" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.302987 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" containerName="registry-server" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.302995 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" containerName="extract-utilities" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303000 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" containerName="extract-utilities" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303007 4778 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="4047fee6-0560-4d41-8212-aa284021dff0" containerName="extract-utilities" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303012 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4047fee6-0560-4d41-8212-aa284021dff0" containerName="extract-utilities" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303021 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303026 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303033 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303039 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303046 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31eaa34f-a155-495d-a833-a54ba9546a1a" containerName="extract-content" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303053 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="31eaa34f-a155-495d-a833-a54ba9546a1a" containerName="extract-content" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303060 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866ee44d-6331-47f7-8452-be6f25815a1e" containerName="registry-server" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303066 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="866ee44d-6331-47f7-8452-be6f25815a1e" containerName="registry-server" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303074 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303080 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303088 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e6f140-3f61-4e58-90fb-cc06884e541e" containerName="pruner" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303094 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e6f140-3f61-4e58-90fb-cc06884e541e" containerName="pruner" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303102 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4047fee6-0560-4d41-8212-aa284021dff0" containerName="registry-server" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303109 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4047fee6-0560-4d41-8212-aa284021dff0" containerName="registry-server" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303117 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31eaa34f-a155-495d-a833-a54ba9546a1a" containerName="extract-utilities" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303122 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="31eaa34f-a155-495d-a833-a54ba9546a1a" containerName="extract-utilities" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303130 
4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4047fee6-0560-4d41-8212-aa284021dff0" containerName="extract-content" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303136 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4047fee6-0560-4d41-8212-aa284021dff0" containerName="extract-content" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303144 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303150 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303159 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866ee44d-6331-47f7-8452-be6f25815a1e" containerName="extract-content" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303165 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="866ee44d-6331-47f7-8452-be6f25815a1e" containerName="extract-content" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.303172 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31eaa34f-a155-495d-a833-a54ba9546a1a" containerName="registry-server" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303177 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="31eaa34f-a155-495d-a833-a54ba9546a1a" containerName="registry-server" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303274 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303284 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="840cfa6b-b6e9-4eb0-a1ca-05d34003f9fa" containerName="registry-server" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303291 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303301 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="31eaa34f-a155-495d-a833-a54ba9546a1a" containerName="registry-server" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303308 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303315 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="866ee44d-6331-47f7-8452-be6f25815a1e" containerName="registry-server" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303339 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e6f140-3f61-4e58-90fb-cc06884e541e" containerName="pruner" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303346 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303353 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4047fee6-0560-4d41-8212-aa284021dff0" containerName="registry-server" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303380 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.303532 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.304496 4778 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.305175 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.311096 4778 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.345235 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.407240 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.407345 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.407406 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.407451 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.407513 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.407539 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.407565 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.407587 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.474556 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.475757 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.476630 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8" exitCode=0 Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.476664 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5" exitCode=0 Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.476674 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63" exitCode=0 Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.476681 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907" exitCode=2 Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.476734 4778 scope.go:117] "RemoveContainer" containerID="f8e6da00aa8a779d98f720e46252f6e4d7e253b6fc0c765667236f74d81886fe" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508640 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508689 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508720 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508745 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508776 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508793 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508811 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508826 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508742 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508845 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508872 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508851 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508889 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508914 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508923 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.508948 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: I1205 15:59:09.640663 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:09 crc kubenswrapper[4778]: W1205 15:59:09.657932 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-0a8630e115542fa23306d801a5ce8e5feacb3afb52048f6ddaa08c886c486d67 WatchSource:0}: Error finding container 0a8630e115542fa23306d801a5ce8e5feacb3afb52048f6ddaa08c886c486d67: Status 404 returned error can't find the container with id 0a8630e115542fa23306d801a5ce8e5feacb3afb52048f6ddaa08c886c486d67 Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.660923 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e5cfdedadd458 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 15:59:09.65988668 +0000 UTC m=+236.763683100,LastTimestamp:2025-12-05 15:59:09.65988668 +0000 UTC m=+236.763683100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.757437 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:59:09Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:59:09Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:59:09Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:59:09Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:15adb3b2133604b064893f8009a74145e4c8bb5b134d111346dcccbdd2aa9bc2\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:164fc35a19aa6cc886c8015c8ee3eba4895e76b1152cb9d795e4f3154a8533a3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610512706},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:5fa3697ca41dc0fe44242792bdef7c0137edb9819758979f604772507a87ccb8\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7054e60ca9b1cc587cfbfb7006c524786d20df04cfccd0e0b7e31117fd45b649\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221145934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:3caf7df6b2c9e0063e3a37bebe5f6a88791aa823fc9edf4420146d6980906dd3\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ca2de14d6541ba5e846fa86ab3093192e5dd576acf08e70ef6f537ba5462e555\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201614156},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1b1026c62413fa239fa4ff6541fe8bda656c1281867ad6ee2c848feccb13c97e\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:2b633ebdc901d19290af4dc2d09e2b59c504c0fc15a3fba410b0ce098e2d5753\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1141987142},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\
\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for 
node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.758323 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.758810 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.759287 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.759698 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:09 crc kubenswrapper[4778]: E1205 15:59:09.759750 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 15:59:10 crc kubenswrapper[4778]: I1205 15:59:10.483481 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"70022343adb7501fd2d69e4156619607baf93f11912945512e2dcac0cfe126e1"} Dec 05 15:59:10 crc kubenswrapper[4778]: I1205 15:59:10.483523 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0a8630e115542fa23306d801a5ce8e5feacb3afb52048f6ddaa08c886c486d67"} Dec 05 15:59:10 crc kubenswrapper[4778]: I1205 15:59:10.484405 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:10 crc kubenswrapper[4778]: I1205 15:59:10.487230 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 15:59:10 crc kubenswrapper[4778]: I1205 15:59:10.490429 4778 generic.go:334] "Generic (PLEG): container finished" podID="b2ebd611-e2d6-45da-a673-29cae7deb0c1" containerID="83b789da940bddb5bfb016b94078beb1b5deddcb5792a2e6b12a4e8633948725" exitCode=0 Dec 05 15:59:10 crc kubenswrapper[4778]: I1205 15:59:10.490464 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b2ebd611-e2d6-45da-a673-29cae7deb0c1","Type":"ContainerDied","Data":"83b789da940bddb5bfb016b94078beb1b5deddcb5792a2e6b12a4e8633948725"} Dec 05 15:59:10 crc kubenswrapper[4778]: I1205 15:59:10.490936 4778 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:10 crc kubenswrapper[4778]: I1205 15:59:10.491471 4778 status_manager.go:851] "Failed to get status for pod" podUID="b2ebd611-e2d6-45da-a673-29cae7deb0c1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.780897 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.782472 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.783402 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.783691 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.784269 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.784632 4778 status_manager.go:851] "Failed to get status for pod" podUID="b2ebd611-e2d6-45da-a673-29cae7deb0c1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.785084 4778 status_manager.go:851] "Failed to get status for pod" podUID="b2ebd611-e2d6-45da-a673-29cae7deb0c1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.785405 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.785917 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.844421 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.844467 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.844510 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.844527 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.844597 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.844640 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.844950 4778 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.844978 4778 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.844989 4778 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.946410 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2ebd611-e2d6-45da-a673-29cae7deb0c1-kubelet-dir\") pod \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\" (UID: \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\") " Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.946517 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2ebd611-e2d6-45da-a673-29cae7deb0c1-kube-api-access\") pod \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\" (UID: \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\") " Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.946547 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2ebd611-e2d6-45da-a673-29cae7deb0c1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b2ebd611-e2d6-45da-a673-29cae7deb0c1" (UID: "b2ebd611-e2d6-45da-a673-29cae7deb0c1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.946580 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b2ebd611-e2d6-45da-a673-29cae7deb0c1-var-lock\") pod \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\" (UID: \"b2ebd611-e2d6-45da-a673-29cae7deb0c1\") " Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.946630 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2ebd611-e2d6-45da-a673-29cae7deb0c1-var-lock" (OuterVolumeSpecName: "var-lock") pod "b2ebd611-e2d6-45da-a673-29cae7deb0c1" (UID: "b2ebd611-e2d6-45da-a673-29cae7deb0c1"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.947111 4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2ebd611-e2d6-45da-a673-29cae7deb0c1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.947149 4778 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b2ebd611-e2d6-45da-a673-29cae7deb0c1-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:11 crc kubenswrapper[4778]: I1205 15:59:11.955257 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ebd611-e2d6-45da-a673-29cae7deb0c1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b2ebd611-e2d6-45da-a673-29cae7deb0c1" (UID: "b2ebd611-e2d6-45da-a673-29cae7deb0c1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.047940 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2ebd611-e2d6-45da-a673-29cae7deb0c1-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:12 crc kubenswrapper[4778]: E1205 15:59:12.476994 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e5cfdedadd458 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 15:59:09.65988668 +0000 UTC m=+236.763683100,LastTimestamp:2025-12-05 15:59:09.65988668 +0000 UTC m=+236.763683100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.506031 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.507556 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258" exitCode=0 Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.507634 4778 scope.go:117] "RemoveContainer" containerID="8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.507731 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.510160 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b2ebd611-e2d6-45da-a673-29cae7deb0c1","Type":"ContainerDied","Data":"b3a884fa8e4a33d77c8d8c6538d591074fd713b7a70fcd8a2cfe6fe2f15c57c0"} Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.510186 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3a884fa8e4a33d77c8d8c6538d591074fd713b7a70fcd8a2cfe6fe2f15c57c0" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.510289 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.525533 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.526175 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.526661 4778 status_manager.go:851] "Failed to get status for pod" podUID="b2ebd611-e2d6-45da-a673-29cae7deb0c1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.528673 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.529186 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.529915 4778 scope.go:117] "RemoveContainer" containerID="976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.531121 4778 status_manager.go:851] "Failed to get status for pod" podUID="b2ebd611-e2d6-45da-a673-29cae7deb0c1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.545685 4778 scope.go:117] "RemoveContainer" containerID="ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63" Dec 05 15:59:12 crc 
kubenswrapper[4778]: I1205 15:59:12.565021 4778 scope.go:117] "RemoveContainer" containerID="9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.578682 4778 scope.go:117] "RemoveContainer" containerID="336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.594980 4778 scope.go:117] "RemoveContainer" containerID="e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.624665 4778 scope.go:117] "RemoveContainer" containerID="8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8" Dec 05 15:59:12 crc kubenswrapper[4778]: E1205 15:59:12.625178 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\": container with ID starting with 8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8 not found: ID does not exist" containerID="8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.625218 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8"} err="failed to get container status \"8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\": rpc error: code = NotFound desc = could not find container \"8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8\": container with ID starting with 8c91e5cfe1c905f9fce3cb4177ede13dc6bd85381b894b7ad438251e964ef4e8 not found: ID does not exist" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.625238 4778 scope.go:117] "RemoveContainer" containerID="976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5" Dec 05 15:59:12 crc kubenswrapper[4778]: E1205 15:59:12.625568 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\": container with ID starting with 976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5 not found: ID does not exist" containerID="976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.625604 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5"} err="failed to get container status \"976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\": rpc error: code = NotFound desc = could not find container \"976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5\": container with ID starting with 976448dbc0d01b825901a9b351544c75ae6995a91b610e80df0fdbf47ba983e5 not found: ID does not exist" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.625645 4778 scope.go:117] "RemoveContainer" containerID="ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63" Dec 05 15:59:12 crc kubenswrapper[4778]: E1205 15:59:12.625864 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\": container with ID starting with ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63 not found: ID does not 
exist" containerID="ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.625886 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63"} err="failed to get container status \"ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\": rpc error: code = NotFound desc = could not find container \"ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63\": container with ID starting with ae18451eed5f40df700636d4d9cadf176e44179beff6aa12bdf4429d463d0f63 not found: ID does not exist" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.625899 4778 scope.go:117] "RemoveContainer" containerID="9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907" Dec 05 15:59:12 crc kubenswrapper[4778]: E1205 15:59:12.626080 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\": container with ID starting with 9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907 not found: ID does not exist" containerID="9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.626105 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907"} err="failed to get container status \"9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\": rpc error: code = NotFound desc = could not find container \"9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907\": container with ID starting with 9812415954a4667b2e5f0b1b9e49debad7a2c025b8bbd6363ff777525ea1e907 not found: ID does not exist" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.626119 4778 scope.go:117] "RemoveContainer" containerID="336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258" Dec 05 15:59:12 crc kubenswrapper[4778]: E1205 15:59:12.626339 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\": container with ID starting with 336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258 not found: ID does not exist" containerID="336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.626381 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258"} err="failed to get container status \"336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\": rpc error: code = NotFound desc = could not find container \"336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258\": container with ID starting with 336476a2b15fdea72d218a2e138979a0292bfa844ca522fc4058ca1fddd9d258 not found: ID does not exist" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.626400 4778 scope.go:117] "RemoveContainer" containerID="e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d" Dec 05 15:59:12 crc kubenswrapper[4778]: E1205 15:59:12.626647 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\": container with ID starting with e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d not found: ID does not exist" containerID="e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d" Dec 05 15:59:12 crc kubenswrapper[4778]: I1205 15:59:12.626673 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d"} err="failed to get container status \"e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\": rpc error: code = NotFound desc = could not find container \"e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d\": container with ID starting with e050dd88f9d6ed41a1c5c1da663122a69dca1514382fd027316bc9cee5ce782d not found: ID does not exist" Dec 05 15:59:13 crc kubenswrapper[4778]: I1205 15:59:13.251147 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:13 crc kubenswrapper[4778]: I1205 15:59:13.251384 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:13 crc kubenswrapper[4778]: I1205 15:59:13.251607 4778 status_manager.go:851] "Failed to get status for pod" podUID="b2ebd611-e2d6-45da-a673-29cae7deb0c1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:13 crc kubenswrapper[4778]: I1205 15:59:13.273437 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 05 15:59:18 crc kubenswrapper[4778]: E1205 15:59:18.961826 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:18 crc kubenswrapper[4778]: E1205 15:59:18.963473 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:18 crc kubenswrapper[4778]: E1205 15:59:18.964428 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:18 crc kubenswrapper[4778]: E1205 15:59:18.965158 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:18 crc kubenswrapper[4778]: E1205 15:59:18.965757 4778 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:18 crc kubenswrapper[4778]: I1205 15:59:18.965814 4778 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 15:59:18 crc kubenswrapper[4778]: E1205 15:59:18.966274 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="200ms" Dec 05 15:59:19 crc kubenswrapper[4778]: E1205 15:59:19.167007 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="400ms" Dec 05 15:59:19 crc kubenswrapper[4778]: E1205 15:59:19.568933 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="800ms" Dec 05 15:59:20 crc kubenswrapper[4778]: E1205 15:59:20.007851 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:59:20Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:59:20Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:59:20Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T15:59:20Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:15adb3b2133604b064893f8009a74145e4c8bb5b134d111346dcccbdd2aa9bc2\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:164fc35a19aa6cc886c8015c8ee3eba4895e76b1152cb9d795e4f3154a8533a3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610512706},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:5fa3697ca41dc0fe44242792bdef7c0137edb9819758979f604772507a87ccb8\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7054e60ca9b1cc587cfbfb7006c524786d20df04cfccd0e0b7e31117fd45b649\\\",\\\"registry.redhat.io/redhat/certified-ope
rator-index:v4.18\\\"],\\\"sizeBytes\\\":1221145934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:3caf7df6b2c9e0063e3a37bebe5f6a88791aa823fc9edf4420146d6980906dd3\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ca2de14d6541ba5e846fa86ab3093192e5dd576acf08e70ef6f537ba5462e555\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201614156},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1b1026c62413fa239fa4ff6541fe8bda656c1281867ad6ee2c848feccb13c97e\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:2b633ebdc901d19290af4dc2d09e2b59c504c0fc15a3fba410b0ce098e2d5753\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1141987142},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],
\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":48599861
6},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:20 crc kubenswrapper[4778]: E1205 15:59:20.009456 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:20 crc kubenswrapper[4778]: E1205 15:59:20.009983 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:20 crc kubenswrapper[4778]: E1205 15:59:20.010298 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:20 crc kubenswrapper[4778]: E1205 15:59:20.011120 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:20 crc kubenswrapper[4778]: E1205 15:59:20.011159 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 15:59:20 crc kubenswrapper[4778]: I1205 15:59:20.248810 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:20 crc kubenswrapper[4778]: I1205 15:59:20.249761 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:20 crc kubenswrapper[4778]: I1205 15:59:20.250401 4778 status_manager.go:851] "Failed to get status for pod" podUID="b2ebd611-e2d6-45da-a673-29cae7deb0c1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:20 crc kubenswrapper[4778]: I1205 15:59:20.264509 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fd281bf-fe41-407b-b13c-5392ccd67b5d" Dec 05 15:59:20 crc kubenswrapper[4778]: I1205 15:59:20.264548 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fd281bf-fe41-407b-b13c-5392ccd67b5d" Dec 05 15:59:20 crc kubenswrapper[4778]: E1205 15:59:20.265038 4778 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:20 crc kubenswrapper[4778]: I1205 15:59:20.265651 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:20 crc kubenswrapper[4778]: E1205 15:59:20.369740 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="1.6s" Dec 05 15:59:20 crc kubenswrapper[4778]: I1205 15:59:20.558087 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3681cf046a05465024144b3b15ba10348b30f9955cc073548e37f20e18d6e078"} Dec 05 15:59:21 crc kubenswrapper[4778]: I1205 15:59:21.568187 4778 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="805b1a4157cc366fe0f39fb667575c9f077504089dd150540190d63bf056a26c" exitCode=0 Dec 05 15:59:21 crc kubenswrapper[4778]: I1205 15:59:21.568241 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"805b1a4157cc366fe0f39fb667575c9f077504089dd150540190d63bf056a26c"} Dec 05 15:59:21 crc kubenswrapper[4778]: I1205 15:59:21.568605 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fd281bf-fe41-407b-b13c-5392ccd67b5d" Dec 05 15:59:21 crc kubenswrapper[4778]: I1205 15:59:21.568623 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fd281bf-fe41-407b-b13c-5392ccd67b5d" Dec 05 15:59:21 crc kubenswrapper[4778]: E1205 15:59:21.568996 4778 mirror_client.go:138] "Failed 
deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:21 crc kubenswrapper[4778]: I1205 15:59:21.569044 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:21 crc kubenswrapper[4778]: I1205 15:59:21.569510 4778 status_manager.go:851] "Failed to get status for pod" podUID="b2ebd611-e2d6-45da-a673-29cae7deb0c1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Dec 05 15:59:21 crc kubenswrapper[4778]: E1205 15:59:21.970730 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="3.2s" Dec 05 15:59:22 crc kubenswrapper[4778]: I1205 15:59:22.615902 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4b0f3f2837cd8b8619e7b5a910b61ee853938f784384f119ebf78b22975c41af"} Dec 05 15:59:22 crc kubenswrapper[4778]: I1205 15:59:22.615951 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e760475c3c78162373a9e1fc9dcccf352bbb6ffa8d744aca0ddbffcb8d731849"} Dec 05 15:59:23 crc kubenswrapper[4778]: I1205 15:59:23.624349 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d9027ec6d3987099cc3172f50e8a30ba6bff6edfec7db3e40ef2119c53abdc42"} Dec 05 15:59:23 crc kubenswrapper[4778]: I1205 15:59:23.624406 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e64d80e7dc264f5fad3a2304d0fa921e7d0804d5d8ec6cfe429114eb63ab7123"} Dec 05 15:59:23 crc kubenswrapper[4778]: I1205 15:59:23.624418 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"11f640844ad79ba1555683bc45eb2381198c867d96c66c66c3124f04ee691b2b"} Dec 05 15:59:23 crc kubenswrapper[4778]: I1205 15:59:23.624513 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:23 crc kubenswrapper[4778]: I1205 15:59:23.624604 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fd281bf-fe41-407b-b13c-5392ccd67b5d" Dec 05 15:59:23 crc kubenswrapper[4778]: I1205 15:59:23.624629 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="1fd281bf-fe41-407b-b13c-5392ccd67b5d" Dec 05 15:59:23 crc kubenswrapper[4778]: I1205 15:59:23.626389 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 15:59:23 crc kubenswrapper[4778]: I1205 15:59:23.626424 4778 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e" exitCode=1 Dec 05 15:59:23 crc kubenswrapper[4778]: I1205 15:59:23.626443 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e"} Dec 05 15:59:23 crc kubenswrapper[4778]: I1205 15:59:23.626729 4778 scope.go:117] "RemoveContainer" containerID="41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e" Dec 05 15:59:24 crc kubenswrapper[4778]: I1205 15:59:24.528629 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" podUID="21e078d9-a539-4626-b30f-908b8e866a7a" containerName="oauth-openshift" containerID="cri-o://ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf" gracePeriod=15 Dec 05 15:59:24 crc kubenswrapper[4778]: I1205 15:59:24.608315 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:59:24 crc kubenswrapper[4778]: I1205 15:59:24.635611 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 15:59:24 crc kubenswrapper[4778]: I1205 15:59:24.635713 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1af4e0ce04400b6434bdaaaddc87daebb09c4ec7e5662c7f230af4c6897ede6f"} Dec 05 15:59:24 crc kubenswrapper[4778]: I1205 15:59:24.995986 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.135634 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-session\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.135719 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21e078d9-a539-4626-b30f-908b8e866a7a-audit-dir\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.135755 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbrlh\" (UniqueName: \"kubernetes.io/projected/21e078d9-a539-4626-b30f-908b8e866a7a-kube-api-access-gbrlh\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.135778 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-ocp-branding-template\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.135824 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-serving-cert\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.135883 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-cliconfig\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.135907 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-service-ca\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.135931 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-idp-0-file-data\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.135964 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-trusted-ca-bundle\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: 
I1205 15:59:25.135990 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-audit-policies\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.136023 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-error\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.136098 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-router-certs\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.136139 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-login\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.136181 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-provider-selection\") pod \"21e078d9-a539-4626-b30f-908b8e866a7a\" (UID: \"21e078d9-a539-4626-b30f-908b8e866a7a\") " Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.137087 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.137874 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21e078d9-a539-4626-b30f-908b8e866a7a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.137927 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.137971 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.138313 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.142619 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.143273 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.143633 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e078d9-a539-4626-b30f-908b8e866a7a-kube-api-access-gbrlh" (OuterVolumeSpecName: "kube-api-access-gbrlh") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "kube-api-access-gbrlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.143757 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.144397 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.144576 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.144959 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.145474 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.148310 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "21e078d9-a539-4626-b30f-908b8e866a7a" (UID: "21e078d9-a539-4626-b30f-908b8e866a7a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237407 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237460 4778 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237483 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237502 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237521 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237540 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237621 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237640 4778 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21e078d9-a539-4626-b30f-908b8e866a7a-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237660 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbrlh\" (UniqueName: \"kubernetes.io/projected/21e078d9-a539-4626-b30f-908b8e866a7a-kube-api-access-gbrlh\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237677 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237697 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237717 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237734 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.237752 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/21e078d9-a539-4626-b30f-908b8e866a7a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.266074 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.266134 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.275462 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.645002 4778 generic.go:334] "Generic (PLEG): container finished" podID="21e078d9-a539-4626-b30f-908b8e866a7a" containerID="ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf" exitCode=0 Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.645111 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.645106 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" event={"ID":"21e078d9-a539-4626-b30f-908b8e866a7a","Type":"ContainerDied","Data":"ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf"} Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.645318 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ln57b" event={"ID":"21e078d9-a539-4626-b30f-908b8e866a7a","Type":"ContainerDied","Data":"ba940eec518780bc4b7c9d0da0af3f0530c8b3211a25d1df1d64bb2c539478db"} Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.645395 4778 scope.go:117] "RemoveContainer" containerID="ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.666110 4778 scope.go:117] "RemoveContainer" containerID="ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf" Dec 05 15:59:25 crc kubenswrapper[4778]: E1205 15:59:25.666692 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf\": container with ID starting with ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf not found: ID does not exist" containerID="ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf" Dec 05 15:59:25 crc kubenswrapper[4778]: I1205 15:59:25.666753 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf"} err="failed to get container status 
\"ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf\": rpc error: code = NotFound desc = could not find container \"ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf\": container with ID starting with ef13fb3a134b9f184c31da21d4ba6d0eef7bbf1d52a275b9f76854f58d9eb0cf not found: ID does not exist" Dec 05 15:59:26 crc kubenswrapper[4778]: I1205 15:59:26.437822 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:59:28 crc kubenswrapper[4778]: I1205 15:59:28.635495 4778 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:28 crc kubenswrapper[4778]: I1205 15:59:28.666932 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fd281bf-fe41-407b-b13c-5392ccd67b5d" Dec 05 15:59:28 crc kubenswrapper[4778]: I1205 15:59:28.666964 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fd281bf-fe41-407b-b13c-5392ccd67b5d" Dec 05 15:59:28 crc kubenswrapper[4778]: I1205 15:59:28.674326 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:28 crc kubenswrapper[4778]: I1205 15:59:28.676866 4778 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="cdea7ac8-02f5-4873-8dca-b547119a3707" Dec 05 15:59:29 crc kubenswrapper[4778]: I1205 15:59:29.672581 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fd281bf-fe41-407b-b13c-5392ccd67b5d" Dec 05 15:59:29 crc kubenswrapper[4778]: I1205 15:59:29.672935 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fd281bf-fe41-407b-b13c-5392ccd67b5d" Dec 05 15:59:32 crc kubenswrapper[4778]: I1205 15:59:32.542169 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:59:32 crc kubenswrapper[4778]: I1205 15:59:32.542423 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 15:59:32 crc kubenswrapper[4778]: I1205 15:59:32.542466 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 15:59:33 crc kubenswrapper[4778]: I1205 15:59:33.258219 4778 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="cdea7ac8-02f5-4873-8dca-b547119a3707" Dec 05 15:59:38 crc kubenswrapper[4778]: I1205 15:59:38.810788 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 15:59:39 crc kubenswrapper[4778]: I1205 15:59:39.306077 4778 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 15:59:39 crc kubenswrapper[4778]: I1205 15:59:39.598103 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 15:59:39 crc kubenswrapper[4778]: I1205 15:59:39.764218 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 15:59:39 crc kubenswrapper[4778]: I1205 15:59:39.827924 4778 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 15:59:40 crc kubenswrapper[4778]: I1205 15:59:40.022084 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 15:59:40 crc kubenswrapper[4778]: I1205 15:59:40.212136 4778 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 15:59:40 crc kubenswrapper[4778]: I1205 15:59:40.212835 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=31.212814166 podStartE2EDuration="31.212814166s" podCreationTimestamp="2025-12-05 15:59:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:59:28.422809516 +0000 UTC m=+255.526605956" watchObservedRunningTime="2025-12-05 15:59:40.212814166 +0000 UTC m=+267.316610576" Dec 05 15:59:40 crc kubenswrapper[4778]: I1205 15:59:40.219071 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-ln57b"] Dec 05 15:59:40 crc kubenswrapper[4778]: I1205 15:59:40.219167 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 15:59:40 crc kubenswrapper[4778]: I1205 15:59:40.219900 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fd281bf-fe41-407b-b13c-5392ccd67b5d" Dec 05 15:59:40 crc kubenswrapper[4778]: I1205 15:59:40.219957 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fd281bf-fe41-407b-b13c-5392ccd67b5d" Dec 05 15:59:40 crc kubenswrapper[4778]: I1205 15:59:40.226653 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 15:59:40 crc kubenswrapper[4778]: I1205 15:59:40.250859 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=12.250832579 podStartE2EDuration="12.250832579s" podCreationTimestamp="2025-12-05 15:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:59:40.249106641 +0000 UTC m=+267.352903051" watchObservedRunningTime="2025-12-05 15:59:40.250832579 +0000 UTC m=+267.354628989" Dec 05 15:59:40 crc kubenswrapper[4778]: I1205 15:59:40.427830 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 15:59:40 crc kubenswrapper[4778]: I1205 15:59:40.632768 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 
Dec 05 15:59:40 crc kubenswrapper[4778]: I1205 15:59:40.954633 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 05 15:59:41 crc kubenswrapper[4778]: I1205 15:59:41.258349 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e078d9-a539-4626-b30f-908b8e866a7a" path="/var/lib/kubelet/pods/21e078d9-a539-4626-b30f-908b8e866a7a/volumes"
Dec 05 15:59:41 crc kubenswrapper[4778]: I1205 15:59:41.296975 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 05 15:59:41 crc kubenswrapper[4778]: I1205 15:59:41.527909 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 05 15:59:41 crc kubenswrapper[4778]: I1205 15:59:41.765192 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 05 15:59:41 crc kubenswrapper[4778]: I1205 15:59:41.999806 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.016130 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.123947 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.198169 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.255500 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.542984 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.543077 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.565605 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.581598 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.655226 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.674519 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.714968 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.781043 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.799674 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.898300 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 05 15:59:42 crc kubenswrapper[4778]: I1205 15:59:42.967147 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 05 15:59:43 crc kubenswrapper[4778]: I1205 15:59:43.012169 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 05 15:59:43 crc kubenswrapper[4778]: I1205 15:59:43.081098 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 05 15:59:43 crc kubenswrapper[4778]: I1205 15:59:43.081297 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 05 15:59:43 crc kubenswrapper[4778]: I1205 15:59:43.219431 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 05 15:59:43 crc kubenswrapper[4778]: I1205 15:59:43.301548 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 05 15:59:43 crc kubenswrapper[4778]: I1205 15:59:43.398394 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 05 15:59:43 crc kubenswrapper[4778]: I1205 15:59:43.480508 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 05 15:59:43 crc kubenswrapper[4778]: I1205 15:59:43.539522 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 05 15:59:43 crc kubenswrapper[4778]: I1205 15:59:43.638992 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 05 15:59:43 crc kubenswrapper[4778]: I1205 15:59:43.718315 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 05 15:59:43 crc kubenswrapper[4778]: I1205 15:59:43.873586 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 05 15:59:43 crc kubenswrapper[4778]: I1205 15:59:43.993188 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 05 15:59:44 crc kubenswrapper[4778]: I1205 15:59:44.074545 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 05 15:59:44 crc kubenswrapper[4778]: I1205 15:59:44.122883 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 05 15:59:44 crc kubenswrapper[4778]: I1205 15:59:44.266161 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 05 15:59:44 crc kubenswrapper[4778]: I1205 15:59:44.365761 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 05 15:59:44 crc kubenswrapper[4778]: I1205 15:59:44.422590 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 05 15:59:44 crc kubenswrapper[4778]: I1205 15:59:44.481584 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 05 15:59:44 crc kubenswrapper[4778]: I1205 15:59:44.536154 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 05 15:59:44 crc kubenswrapper[4778]: I1205 15:59:44.634097 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 05 15:59:44 crc kubenswrapper[4778]: I1205 15:59:44.734682 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 05 15:59:44 crc kubenswrapper[4778]: I1205 15:59:44.743330 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 05 15:59:44 crc kubenswrapper[4778]: I1205 15:59:44.752270 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 05 15:59:44 crc kubenswrapper[4778]: I1205 15:59:44.898315 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.009687 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.031791 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.095757 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.137181 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.373349 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.479637 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.490757 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.529843 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.537279 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.552032 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.603037 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.730413 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.806205 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.813560 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.843942 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 05 15:59:45 crc kubenswrapper[4778]: I1205 15:59:45.851462 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.152412 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.152494 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.152911 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.153067 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.157013 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.157191 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.157329 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.157726 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.246224 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.311606 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.353114 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.451489 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 05 15:59:46 crc kubenswrapper[4778]: I1205
15:59:46.465433 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.489734 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.506858 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.516104 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.530069 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.575531 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.668445 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.673017 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.764843 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.813778 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.850476 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.868838 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.896130 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.911277 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.930336 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.958317 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.992097 4778 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 15:59:46 crc kubenswrapper[4778]: I1205 15:59:46.999434 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.006470 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 
15:59:47.016076 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.026741 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.189563 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.411240 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.456670 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.561081 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.699994 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.709123 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.717541 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.745823 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.855016 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.872633 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.955827 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.959350 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 15:59:47 crc kubenswrapper[4778]: I1205 15:59:47.995965 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.060311 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.095258 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.100091 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.177864 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.238906 4778 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.317012 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.372163 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.395108 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.407549 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.435209 4778 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.552322 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.568985 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.576678 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.584775 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.616084 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.714993 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.865983 4778 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 15:59:48 crc kubenswrapper[4778]: I1205 15:59:48.923777 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.034903 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.064645 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.112730 4778 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.145977 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.231750 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.233455 4778 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.290472 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.373562 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.398404 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.445414 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.477916 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.484280 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.498785 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.545994 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.565777 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.623997 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.624781 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.625655 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.788988 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.851473 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 15:59:49 crc kubenswrapper[4778]: I1205 15:59:49.967415 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.035096 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.048623 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.304350 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.307091 4778 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.312194 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.320749 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.346199 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.393697 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.486535 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.489873 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.574840 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.649152 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.713538 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.942650 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 15:59:50 crc kubenswrapper[4778]: I1205 15:59:50.993331 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.042136 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.145630 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.147523 4778 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.147756 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://70022343adb7501fd2d69e4156619607baf93f11912945512e2dcac0cfe126e1" gracePeriod=5 Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.185434 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.192130 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.259655 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 15:59:51 crc 
kubenswrapper[4778]: I1205 15:59:51.293000 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.436630 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.611685 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.681199 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.703178 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.758227 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.778647 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.786781 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.837332 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.849905 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.897332 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 15:59:51 crc kubenswrapper[4778]: I1205 15:59:51.961646 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.059831 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.122586 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.165121 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.241946 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.347785 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.377038 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.399046 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.478726 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.542725 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.542870 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.542967 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.544216 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"1af4e0ce04400b6434bdaaaddc87daebb09c4ec7e5662c7f230af4c6897ede6f"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.544398 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://1af4e0ce04400b6434bdaaaddc87daebb09c4ec7e5662c7f230af4c6897ede6f" gracePeriod=30 Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.587870 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.601143 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.844655 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-985c66b4-cd9wv"] Dec 05 15:59:52 crc kubenswrapper[4778]: E1205 15:59:52.844867 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.844882 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 15:59:52 crc kubenswrapper[4778]: E1205 15:59:52.844900 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e078d9-a539-4626-b30f-908b8e866a7a" containerName="oauth-openshift" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.844909 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e078d9-a539-4626-b30f-908b8e866a7a" containerName="oauth-openshift" Dec 05 15:59:52 crc kubenswrapper[4778]: E1205 15:59:52.844928 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b2ebd611-e2d6-45da-a673-29cae7deb0c1" containerName="installer" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.844937 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ebd611-e2d6-45da-a673-29cae7deb0c1" containerName="installer" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.845043 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e078d9-a539-4626-b30f-908b8e866a7a" containerName="oauth-openshift" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.845059 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.845070 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2ebd611-e2d6-45da-a673-29cae7deb0c1" containerName="installer" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.845487 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.847743 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.847880 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.848829 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.848930 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.849167 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.849724 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.850014 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.850211 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.850416 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.850511 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.850839 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.854033 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.855848 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/4799c5f7-2295-44d3-9e46-a6aaf8523486-audit-policies\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.855893 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-router-certs\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.855992 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-cliconfig\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.856054 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-serving-cert\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.856117 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-session\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.859112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.863655 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-user-template-error\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.863763 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.863920 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-service-ca\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.864031 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-user-template-login\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.864128 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxplv\" (UniqueName: \"kubernetes.io/projected/4799c5f7-2295-44d3-9e46-a6aaf8523486-kube-api-access-kxplv\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.864242 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.864407 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.864547 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4799c5f7-2295-44d3-9e46-a6aaf8523486-audit-dir\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.869981 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.869985 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.884842 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-985c66b4-cd9wv"] Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.886427 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.929886 4778 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.966894 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.967057 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4799c5f7-2295-44d3-9e46-a6aaf8523486-audit-dir\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.967102 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4799c5f7-2295-44d3-9e46-a6aaf8523486-audit-policies\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.967139 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-router-certs\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.967213 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-cliconfig\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.967261 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-serving-cert\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.967306 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.967338 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-session\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc 
kubenswrapper[4778]: I1205 15:59:52.967410 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-user-template-error\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.967451 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.967512 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-service-ca\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.967554 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-user-template-login\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.967590 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxplv\" (UniqueName: \"kubernetes.io/projected/4799c5f7-2295-44d3-9e46-a6aaf8523486-kube-api-access-kxplv\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.967629 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.970351 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4799c5f7-2295-44d3-9e46-a6aaf8523486-audit-policies\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.970436 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4799c5f7-2295-44d3-9e46-a6aaf8523486-audit-dir\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.970687 4778 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.970832 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-cliconfig\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.977592 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.983210 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-service-ca\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.997952 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-session\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.998315 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-serving-cert\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.998656 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:52 crc kubenswrapper[4778]: I1205 15:59:52.999217 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-user-template-login\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.000437 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.000684 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-user-template-error\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.000835 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4799c5f7-2295-44d3-9e46-a6aaf8523486-v4-0-config-system-router-certs\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.018027 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxplv\" (UniqueName: \"kubernetes.io/projected/4799c5f7-2295-44d3-9e46-a6aaf8523486-kube-api-access-kxplv\") pod \"oauth-openshift-985c66b4-cd9wv\" (UID: \"4799c5f7-2295-44d3-9e46-a6aaf8523486\") " pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.171825 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.225587 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.268314 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.417643 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.468298 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.484316 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-985c66b4-cd9wv"] Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.587669 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.622741 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.752015 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.755391 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.787472 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"config" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.807324 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.821247 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.826467 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" event={"ID":"4799c5f7-2295-44d3-9e46-a6aaf8523486","Type":"ContainerStarted","Data":"61770e82540c3095ac5fc0fa2b89a6abcc3df098bd798d577cf0a8f68c8d277e"} Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.837545 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.882630 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.924168 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 15:59:53 crc kubenswrapper[4778]: I1205 15:59:53.995732 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.016065 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.079706 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.240773 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.265835 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.335112 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.566002 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.584430 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.628089 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.682048 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.809113 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.834323 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" event={"ID":"4799c5f7-2295-44d3-9e46-a6aaf8523486","Type":"ContainerStarted","Data":"0211e1d2f384ad258567f7d96e3f23cfcca82e495983c7cdf821812659f48adb"} Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.834626 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.858049 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.868969 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-985c66b4-cd9wv" podStartSLOduration=55.868941522 podStartE2EDuration="55.868941522s" podCreationTimestamp="2025-12-05 15:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 15:59:54.868770408 +0000 UTC m=+281.972566798" watchObservedRunningTime="2025-12-05 15:59:54.868941522 +0000 UTC m=+281.972737912" Dec 05 15:59:54 crc kubenswrapper[4778]: I1205 15:59:54.934151 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 15:59:55 crc kubenswrapper[4778]: I1205 15:59:55.141258 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 15:59:55 crc kubenswrapper[4778]: I1205 15:59:55.220602 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 15:59:55 crc kubenswrapper[4778]: I1205 15:59:55.393830 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 15:59:55 crc kubenswrapper[4778]: I1205 15:59:55.668037 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 15:59:55 crc kubenswrapper[4778]: I1205 15:59:55.875777 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 15:59:55 crc kubenswrapper[4778]: I1205 15:59:55.937455 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 15:59:55 crc kubenswrapper[4778]: I1205 15:59:55.940943 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 15:59:55 crc kubenswrapper[4778]: I1205 15:59:55.951469 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.298022 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.298116 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.360236 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.361296 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.361411 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.361410 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.361494 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.361495 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.361526 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.361551 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.361568 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.361689 4778 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.361701 4778 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.361710 4778 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.361762 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.379644 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.441346 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.457222 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.462430 4778 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.462464 4778 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.510069 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.848693 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.849084 4778 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="70022343adb7501fd2d69e4156619607baf93f11912945512e2dcac0cfe126e1" exitCode=137 Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.849189 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.849286 4778 scope.go:117] "RemoveContainer" containerID="70022343adb7501fd2d69e4156619607baf93f11912945512e2dcac0cfe126e1" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.894029 4778 scope.go:117] "RemoveContainer" containerID="70022343adb7501fd2d69e4156619607baf93f11912945512e2dcac0cfe126e1" Dec 05 15:59:56 crc kubenswrapper[4778]: E1205 15:59:56.895150 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70022343adb7501fd2d69e4156619607baf93f11912945512e2dcac0cfe126e1\": container with ID starting with 70022343adb7501fd2d69e4156619607baf93f11912945512e2dcac0cfe126e1 not found: ID does not exist" containerID="70022343adb7501fd2d69e4156619607baf93f11912945512e2dcac0cfe126e1" Dec 05 15:59:56 crc kubenswrapper[4778]: I1205 15:59:56.895249 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70022343adb7501fd2d69e4156619607baf93f11912945512e2dcac0cfe126e1"} err="failed to get container status \"70022343adb7501fd2d69e4156619607baf93f11912945512e2dcac0cfe126e1\": rpc error: code = NotFound desc = could not find container \"70022343adb7501fd2d69e4156619607baf93f11912945512e2dcac0cfe126e1\": container with ID starting with 70022343adb7501fd2d69e4156619607baf93f11912945512e2dcac0cfe126e1 not found: ID does not exist" Dec 05 15:59:57 crc kubenswrapper[4778]: I1205 15:59:57.091625 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 15:59:57 crc kubenswrapper[4778]: I1205 15:59:57.225878 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 15:59:57 crc kubenswrapper[4778]: I1205 15:59:57.261988 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 05 15:59:57 crc kubenswrapper[4778]: I1205 15:59:57.262513 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 05 15:59:57 crc kubenswrapper[4778]: I1205 15:59:57.280580 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 15:59:57 crc kubenswrapper[4778]: I1205 15:59:57.280647 4778 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="6a551a5d-8802-48e2-bf11-ad51366f036e" Dec 05 15:59:57 crc kubenswrapper[4778]: I1205 15:59:57.288938 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 15:59:57 crc kubenswrapper[4778]: I1205 15:59:57.289000 4778 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="6a551a5d-8802-48e2-bf11-ad51366f036e" Dec 05 15:59:57 crc kubenswrapper[4778]: I1205 15:59:57.593922 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 15:59:58 crc kubenswrapper[4778]: I1205 15:59:58.149176 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 16:00:13 crc kubenswrapper[4778]: I1205 16:00:13.094025 4778 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 05 16:00:14 crc kubenswrapper[4778]: I1205 16:00:14.971117 4778 generic.go:334] "Generic (PLEG): container finished" podID="66a3882a-e9bc-40d4-b51f-e47d9354f53a" containerID="e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9" exitCode=0 Dec 05 16:00:14 crc kubenswrapper[4778]: I1205 16:00:14.971279 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" event={"ID":"66a3882a-e9bc-40d4-b51f-e47d9354f53a","Type":"ContainerDied","Data":"e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9"} Dec 05 16:00:14 crc kubenswrapper[4778]: I1205 16:00:14.972291 4778 scope.go:117] "RemoveContainer" containerID="e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9" Dec 05 16:00:15 crc kubenswrapper[4778]: I1205 16:00:15.980695 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" event={"ID":"66a3882a-e9bc-40d4-b51f-e47d9354f53a","Type":"ContainerStarted","Data":"1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9"} Dec 05 16:00:15 crc kubenswrapper[4778]: I1205 16:00:15.981976 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 16:00:15 crc kubenswrapper[4778]: I1205 16:00:15.983771 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 16:00:23 crc kubenswrapper[4778]: I1205 16:00:23.034175 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 16:00:23 crc kubenswrapper[4778]: I1205 16:00:23.038659 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 16:00:23 crc kubenswrapper[4778]: I1205 16:00:23.038742 4778 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1af4e0ce04400b6434bdaaaddc87daebb09c4ec7e5662c7f230af4c6897ede6f" exitCode=137 Dec 05 16:00:23 crc kubenswrapper[4778]: I1205 16:00:23.038788 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1af4e0ce04400b6434bdaaaddc87daebb09c4ec7e5662c7f230af4c6897ede6f"} Dec 05 16:00:23 crc kubenswrapper[4778]: I1205 16:00:23.038835 4778 scope.go:117] "RemoveContainer" containerID="41f9189fe66413039ae1c0470a69dd0c9479c28a3bebc913b61b6955fc4b476e" Dec 05 16:00:24 crc kubenswrapper[4778]: I1205 16:00:24.049952 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 16:00:24 crc kubenswrapper[4778]: I1205 16:00:24.061023 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"23808d5fc2920e629b92984a6c3eb402963fafa308c067a19261d0b1f917e66d"} Dec 05 16:00:26 crc kubenswrapper[4778]: I1205 16:00:26.438011 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:00:32 crc kubenswrapper[4778]: I1205 16:00:32.542524 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:00:32 crc kubenswrapper[4778]: I1205 16:00:32.549305 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:00:33 crc kubenswrapper[4778]: I1205 16:00:33.118005 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:00:33 crc kubenswrapper[4778]: I1205 16:00:33.415218 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:00:33 crc kubenswrapper[4778]: I1205 16:00:33.415328 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.615625 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr"] Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.617112 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.622611 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.634685 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.644752 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr"] Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.664881 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4sh2x"] Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.665206 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" podUID="10a607f0-e1ef-405d-9771-54076793d426" containerName="controller-manager" containerID="cri-o://8f812803cfe9ef7eef20d31d0db2188c48d3e35f9969ba0edf7062582b4508d6" gracePeriod=30 Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.671747 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh"] Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.672054 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" podUID="782cf494-3079-47fe-8f6c-f7d5731a5b69" containerName="route-controller-manager" containerID="cri-o://c1782157bd3a3558c3268f513bbc8b6601d6c8ebed287c582554e9b40d194492" gracePeriod=30 Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.787439 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-config-volume\") pod \"collect-profiles-29415840-5r8wr\" (UID: \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.787505 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7kx4\" (UniqueName: \"kubernetes.io/projected/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-kube-api-access-j7kx4\") pod \"collect-profiles-29415840-5r8wr\" (UID: \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.787541 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-secret-volume\") pod \"collect-profiles-29415840-5r8wr\" (UID: \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.841912 4778 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6jmzh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: 
connect: connection refused" start-of-body= Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.842210 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" podUID="782cf494-3079-47fe-8f6c-f7d5731a5b69" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.867948 4778 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4sh2x container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.868019 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" podUID="10a607f0-e1ef-405d-9771-54076793d426" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.888578 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-config-volume\") pod \"collect-profiles-29415840-5r8wr\" (UID: \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.888707 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7kx4\" (UniqueName: \"kubernetes.io/projected/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-kube-api-access-j7kx4\") pod \"collect-profiles-29415840-5r8wr\" (UID: \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.889086 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-secret-volume\") pod \"collect-profiles-29415840-5r8wr\" (UID: \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.889476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-config-volume\") pod \"collect-profiles-29415840-5r8wr\" (UID: \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.908995 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-secret-volume\") pod \"collect-profiles-29415840-5r8wr\" (UID: \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.909613 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7kx4\" (UniqueName: 
\"kubernetes.io/projected/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-kube-api-access-j7kx4\") pod \"collect-profiles-29415840-5r8wr\" (UID: \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:39 crc kubenswrapper[4778]: I1205 16:00:39.936798 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.163538 4778 generic.go:334] "Generic (PLEG): container finished" podID="782cf494-3079-47fe-8f6c-f7d5731a5b69" containerID="c1782157bd3a3558c3268f513bbc8b6601d6c8ebed287c582554e9b40d194492" exitCode=0 Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.163599 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" event={"ID":"782cf494-3079-47fe-8f6c-f7d5731a5b69","Type":"ContainerDied","Data":"c1782157bd3a3558c3268f513bbc8b6601d6c8ebed287c582554e9b40d194492"} Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.165892 4778 generic.go:334] "Generic (PLEG): container finished" podID="10a607f0-e1ef-405d-9771-54076793d426" containerID="8f812803cfe9ef7eef20d31d0db2188c48d3e35f9969ba0edf7062582b4508d6" exitCode=0 Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.165919 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" event={"ID":"10a607f0-e1ef-405d-9771-54076793d426","Type":"ContainerDied","Data":"8f812803cfe9ef7eef20d31d0db2188c48d3e35f9969ba0edf7062582b4508d6"} Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.211618 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.217463 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.439267 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/782cf494-3079-47fe-8f6c-f7d5731a5b69-serving-cert\") pod \"782cf494-3079-47fe-8f6c-f7d5731a5b69\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.439573 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/782cf494-3079-47fe-8f6c-f7d5731a5b69-config\") pod \"782cf494-3079-47fe-8f6c-f7d5731a5b69\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.439654 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-proxy-ca-bundles\") pod \"10a607f0-e1ef-405d-9771-54076793d426\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.439686 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv2rp\" (UniqueName: \"kubernetes.io/projected/10a607f0-e1ef-405d-9771-54076793d426-kube-api-access-hv2rp\") pod \"10a607f0-e1ef-405d-9771-54076793d426\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.439725 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rbqx\" (UniqueName: \"kubernetes.io/projected/782cf494-3079-47fe-8f6c-f7d5731a5b69-kube-api-access-5rbqx\") pod \"782cf494-3079-47fe-8f6c-f7d5731a5b69\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.439756 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/782cf494-3079-47fe-8f6c-f7d5731a5b69-client-ca\") pod \"782cf494-3079-47fe-8f6c-f7d5731a5b69\" (UID: \"782cf494-3079-47fe-8f6c-f7d5731a5b69\") " Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.439796 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-client-ca\") pod \"10a607f0-e1ef-405d-9771-54076793d426\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.439813 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10a607f0-e1ef-405d-9771-54076793d426-serving-cert\") pod \"10a607f0-e1ef-405d-9771-54076793d426\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.439873 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-config\") pod \"10a607f0-e1ef-405d-9771-54076793d426\" (UID: \"10a607f0-e1ef-405d-9771-54076793d426\") " Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.441423 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/782cf494-3079-47fe-8f6c-f7d5731a5b69-config" (OuterVolumeSpecName: "config") pod "782cf494-3079-47fe-8f6c-f7d5731a5b69" (UID: 
"782cf494-3079-47fe-8f6c-f7d5731a5b69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.441949 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "10a607f0-e1ef-405d-9771-54076793d426" (UID: "10a607f0-e1ef-405d-9771-54076793d426"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.441966 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-config" (OuterVolumeSpecName: "config") pod "10a607f0-e1ef-405d-9771-54076793d426" (UID: "10a607f0-e1ef-405d-9771-54076793d426"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.442598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-client-ca" (OuterVolumeSpecName: "client-ca") pod "10a607f0-e1ef-405d-9771-54076793d426" (UID: "10a607f0-e1ef-405d-9771-54076793d426"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.442664 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/782cf494-3079-47fe-8f6c-f7d5731a5b69-client-ca" (OuterVolumeSpecName: "client-ca") pod "782cf494-3079-47fe-8f6c-f7d5731a5b69" (UID: "782cf494-3079-47fe-8f6c-f7d5731a5b69"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.449038 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/782cf494-3079-47fe-8f6c-f7d5731a5b69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "782cf494-3079-47fe-8f6c-f7d5731a5b69" (UID: "782cf494-3079-47fe-8f6c-f7d5731a5b69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.455458 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10a607f0-e1ef-405d-9771-54076793d426-kube-api-access-hv2rp" (OuterVolumeSpecName: "kube-api-access-hv2rp") pod "10a607f0-e1ef-405d-9771-54076793d426" (UID: "10a607f0-e1ef-405d-9771-54076793d426"). InnerVolumeSpecName "kube-api-access-hv2rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.455511 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a607f0-e1ef-405d-9771-54076793d426-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "10a607f0-e1ef-405d-9771-54076793d426" (UID: "10a607f0-e1ef-405d-9771-54076793d426"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.456416 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/782cf494-3079-47fe-8f6c-f7d5731a5b69-kube-api-access-5rbqx" (OuterVolumeSpecName: "kube-api-access-5rbqx") pod "782cf494-3079-47fe-8f6c-f7d5731a5b69" (UID: "782cf494-3079-47fe-8f6c-f7d5731a5b69"). 
InnerVolumeSpecName "kube-api-access-5rbqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.471914 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr"] Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.541001 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rbqx\" (UniqueName: \"kubernetes.io/projected/782cf494-3079-47fe-8f6c-f7d5731a5b69-kube-api-access-5rbqx\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.541290 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/782cf494-3079-47fe-8f6c-f7d5731a5b69-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.541302 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.541311 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10a607f0-e1ef-405d-9771-54076793d426-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.541321 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.541332 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/782cf494-3079-47fe-8f6c-f7d5731a5b69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.541342 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/782cf494-3079-47fe-8f6c-f7d5731a5b69-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.541350 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10a607f0-e1ef-405d-9771-54076793d426-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:40 crc kubenswrapper[4778]: I1205 16:00:40.541358 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv2rp\" (UniqueName: \"kubernetes.io/projected/10a607f0-e1ef-405d-9771-54076793d426-kube-api-access-hv2rp\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.172462 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.172454 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4sh2x" event={"ID":"10a607f0-e1ef-405d-9771-54076793d426","Type":"ContainerDied","Data":"165e85a7da479e2ff22022cad2f2b3a35405f6d6fe21fb4b6b33d22e15700323"} Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.172960 4778 scope.go:117] "RemoveContainer" containerID="8f812803cfe9ef7eef20d31d0db2188c48d3e35f9969ba0edf7062582b4508d6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.174182 4778 generic.go:334] "Generic (PLEG): container finished" podID="1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60" containerID="f97cc2bd8fb7503e742293e22094fbb1c486e852a0e7f18b848021e492d704cf" exitCode=0 Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.174264 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" event={"ID":"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60","Type":"ContainerDied","Data":"f97cc2bd8fb7503e742293e22094fbb1c486e852a0e7f18b848021e492d704cf"} Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.174296 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" event={"ID":"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60","Type":"ContainerStarted","Data":"c26e4113d98199413addf749d62d68691441500d19f7465b33946f16d9a9626e"} Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.175965 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" event={"ID":"782cf494-3079-47fe-8f6c-f7d5731a5b69","Type":"ContainerDied","Data":"7d89690147654bcae8e19db3ec7c3d41285c91f13fc30035523798dc3826440f"} Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.176127 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.204580 4778 scope.go:117] "RemoveContainer" containerID="c1782157bd3a3558c3268f513bbc8b6601d6c8ebed287c582554e9b40d194492" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.218514 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh"] Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.226303 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jmzh"] Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.237146 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4sh2x"] Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.241465 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4sh2x"] Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.261541 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10a607f0-e1ef-405d-9771-54076793d426" path="/var/lib/kubelet/pods/10a607f0-e1ef-405d-9771-54076793d426/volumes" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.262623 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="782cf494-3079-47fe-8f6c-f7d5731a5b69" path="/var/lib/kubelet/pods/782cf494-3079-47fe-8f6c-f7d5731a5b69/volumes" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.501910 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m"] Dec 05 16:00:41 crc kubenswrapper[4778]: E1205 16:00:41.502333 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10a607f0-e1ef-405d-9771-54076793d426" containerName="controller-manager" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.502361 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a607f0-e1ef-405d-9771-54076793d426" containerName="controller-manager" Dec 05 16:00:41 crc kubenswrapper[4778]: E1205 16:00:41.502413 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782cf494-3079-47fe-8f6c-f7d5731a5b69" containerName="route-controller-manager" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.502428 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="782cf494-3079-47fe-8f6c-f7d5731a5b69" containerName="route-controller-manager" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.502599 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="10a607f0-e1ef-405d-9771-54076793d426" containerName="controller-manager" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.502631 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="782cf494-3079-47fe-8f6c-f7d5731a5b69" containerName="route-controller-manager" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.503247 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.505474 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.505654 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6"] Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.505831 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.506518 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.508322 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.508726 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.510941 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.511414 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.511702 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.512232 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.513254 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.513283 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.513539 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.514561 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.518669 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6"] Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.521167 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.524947 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m"] Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.551311 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq5tv\" (UniqueName: 
\"kubernetes.io/projected/5fe28cf2-0108-44f9-b364-90c846220e36-kube-api-access-bq5tv\") pod \"route-controller-manager-b4dbb9d8d-6drd6\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.551486 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5fe28cf2-0108-44f9-b364-90c846220e36-client-ca\") pod \"route-controller-manager-b4dbb9d8d-6drd6\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.551539 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-serving-cert\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.551615 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-client-ca\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.551660 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-config\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.551710 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9hm5\" (UniqueName: \"kubernetes.io/projected/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-kube-api-access-k9hm5\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.551755 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-proxy-ca-bundles\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.551853 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fe28cf2-0108-44f9-b364-90c846220e36-serving-cert\") pod \"route-controller-manager-b4dbb9d8d-6drd6\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.551908 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5fe28cf2-0108-44f9-b364-90c846220e36-config\") pod \"route-controller-manager-b4dbb9d8d-6drd6\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.653003 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq5tv\" (UniqueName: \"kubernetes.io/projected/5fe28cf2-0108-44f9-b364-90c846220e36-kube-api-access-bq5tv\") pod \"route-controller-manager-b4dbb9d8d-6drd6\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.653089 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5fe28cf2-0108-44f9-b364-90c846220e36-client-ca\") pod \"route-controller-manager-b4dbb9d8d-6drd6\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.653127 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-serving-cert\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.653197 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-client-ca\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.653239 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-config\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.653273 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9hm5\" (UniqueName: \"kubernetes.io/projected/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-kube-api-access-k9hm5\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.653309 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-proxy-ca-bundles\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.653410 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fe28cf2-0108-44f9-b364-90c846220e36-serving-cert\") pod \"route-controller-manager-b4dbb9d8d-6drd6\" (UID: 
\"5fe28cf2-0108-44f9-b364-90c846220e36\") " pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.653447 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fe28cf2-0108-44f9-b364-90c846220e36-config\") pod \"route-controller-manager-b4dbb9d8d-6drd6\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.656587 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5fe28cf2-0108-44f9-b364-90c846220e36-client-ca\") pod \"route-controller-manager-b4dbb9d8d-6drd6\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.656679 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-client-ca\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.656936 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fe28cf2-0108-44f9-b364-90c846220e36-config\") pod \"route-controller-manager-b4dbb9d8d-6drd6\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.657405 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-config\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.661470 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-proxy-ca-bundles\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.662713 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fe28cf2-0108-44f9-b364-90c846220e36-serving-cert\") pod \"route-controller-manager-b4dbb9d8d-6drd6\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.663662 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-serving-cert\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.674227 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-k9hm5\" (UniqueName: \"kubernetes.io/projected/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-kube-api-access-k9hm5\") pod \"controller-manager-5cf665bbc7-7tl8m\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.680192 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq5tv\" (UniqueName: \"kubernetes.io/projected/5fe28cf2-0108-44f9-b364-90c846220e36-kube-api-access-bq5tv\") pod \"route-controller-manager-b4dbb9d8d-6drd6\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.858222 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:41 crc kubenswrapper[4778]: I1205 16:00:41.867671 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:42 crc kubenswrapper[4778]: I1205 16:00:42.346709 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m"] Dec 05 16:00:42 crc kubenswrapper[4778]: I1205 16:00:42.365481 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6"] Dec 05 16:00:42 crc kubenswrapper[4778]: I1205 16:00:42.512952 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:42 crc kubenswrapper[4778]: I1205 16:00:42.570063 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-config-volume\") pod \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\" (UID: \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\") " Dec 05 16:00:42 crc kubenswrapper[4778]: I1205 16:00:42.570419 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-secret-volume\") pod \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\" (UID: \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\") " Dec 05 16:00:42 crc kubenswrapper[4778]: I1205 16:00:42.570467 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7kx4\" (UniqueName: \"kubernetes.io/projected/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-kube-api-access-j7kx4\") pod \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\" (UID: \"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60\") " Dec 05 16:00:42 crc kubenswrapper[4778]: I1205 16:00:42.571062 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-config-volume" (OuterVolumeSpecName: "config-volume") pod "1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60" (UID: "1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:00:42 crc kubenswrapper[4778]: I1205 16:00:42.572539 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:42 crc kubenswrapper[4778]: I1205 16:00:42.576109 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60" (UID: "1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:00:42 crc kubenswrapper[4778]: I1205 16:00:42.579475 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-kube-api-access-j7kx4" (OuterVolumeSpecName: "kube-api-access-j7kx4") pod "1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60" (UID: "1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60"). InnerVolumeSpecName "kube-api-access-j7kx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:00:42 crc kubenswrapper[4778]: I1205 16:00:42.673396 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:42 crc kubenswrapper[4778]: I1205 16:00:42.673456 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7kx4\" (UniqueName: \"kubernetes.io/projected/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60-kube-api-access-j7kx4\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.192109 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" event={"ID":"ebde0a6c-6b90-4bce-9174-324dd7ce48ad","Type":"ContainerStarted","Data":"87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea"} Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.192160 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" event={"ID":"ebde0a6c-6b90-4bce-9174-324dd7ce48ad","Type":"ContainerStarted","Data":"4991c8acec1b8ad72bf5eadbb0a9c73f31a394d1cd643b3d5683f31da3911708"} Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.192184 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.194950 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" event={"ID":"1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60","Type":"ContainerDied","Data":"c26e4113d98199413addf749d62d68691441500d19f7465b33946f16d9a9626e"} Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.194983 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c26e4113d98199413addf749d62d68691441500d19f7465b33946f16d9a9626e" Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.195168 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr" Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.196228 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" event={"ID":"5fe28cf2-0108-44f9-b364-90c846220e36","Type":"ContainerStarted","Data":"74734f95065baa768d637b46f723a9bd8258542654b9797e62935fd038d54a16"} Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.196282 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" event={"ID":"5fe28cf2-0108-44f9-b364-90c846220e36","Type":"ContainerStarted","Data":"d2ba4d8a9b065604cc1eda3c3cbf672fa88dc674f4dc0c2a6bb503b5e937aa78"} Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.196388 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.196830 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.201656 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.215191 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" podStartSLOduration=4.215172952 podStartE2EDuration="4.215172952s" podCreationTimestamp="2025-12-05 16:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:00:43.209292931 +0000 UTC m=+330.313089321" watchObservedRunningTime="2025-12-05 16:00:43.215172952 +0000 UTC m=+330.318969332" Dec 05 16:00:43 crc kubenswrapper[4778]: I1205 16:00:43.230707 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" podStartSLOduration=4.230686448 podStartE2EDuration="4.230686448s" podCreationTimestamp="2025-12-05 16:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:00:43.229137745 +0000 UTC m=+330.332934135" watchObservedRunningTime="2025-12-05 16:00:43.230686448 +0000 UTC m=+330.334482828" Dec 05 16:00:44 crc kubenswrapper[4778]: I1205 16:00:44.910228 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6"] Dec 05 16:00:46 crc kubenswrapper[4778]: I1205 16:00:46.211850 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" podUID="5fe28cf2-0108-44f9-b364-90c846220e36" containerName="route-controller-manager" containerID="cri-o://74734f95065baa768d637b46f723a9bd8258542654b9797e62935fd038d54a16" gracePeriod=30 Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.217444 4778 generic.go:334] "Generic (PLEG): container finished" podID="5fe28cf2-0108-44f9-b364-90c846220e36" containerID="74734f95065baa768d637b46f723a9bd8258542654b9797e62935fd038d54a16" exitCode=0 Dec 05 16:00:47 crc kubenswrapper[4778]: 
I1205 16:00:47.217724 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" event={"ID":"5fe28cf2-0108-44f9-b364-90c846220e36","Type":"ContainerDied","Data":"74734f95065baa768d637b46f723a9bd8258542654b9797e62935fd038d54a16"} Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.621574 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.643156 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq5tv\" (UniqueName: \"kubernetes.io/projected/5fe28cf2-0108-44f9-b364-90c846220e36-kube-api-access-bq5tv\") pod \"5fe28cf2-0108-44f9-b364-90c846220e36\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.643212 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5fe28cf2-0108-44f9-b364-90c846220e36-client-ca\") pod \"5fe28cf2-0108-44f9-b364-90c846220e36\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.643273 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fe28cf2-0108-44f9-b364-90c846220e36-serving-cert\") pod \"5fe28cf2-0108-44f9-b364-90c846220e36\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.643323 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fe28cf2-0108-44f9-b364-90c846220e36-config\") pod \"5fe28cf2-0108-44f9-b364-90c846220e36\" (UID: \"5fe28cf2-0108-44f9-b364-90c846220e36\") " Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.643985 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fe28cf2-0108-44f9-b364-90c846220e36-client-ca" (OuterVolumeSpecName: "client-ca") pod "5fe28cf2-0108-44f9-b364-90c846220e36" (UID: "5fe28cf2-0108-44f9-b364-90c846220e36"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.644079 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fe28cf2-0108-44f9-b364-90c846220e36-config" (OuterVolumeSpecName: "config") pod "5fe28cf2-0108-44f9-b364-90c846220e36" (UID: "5fe28cf2-0108-44f9-b364-90c846220e36"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.649881 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe28cf2-0108-44f9-b364-90c846220e36-kube-api-access-bq5tv" (OuterVolumeSpecName: "kube-api-access-bq5tv") pod "5fe28cf2-0108-44f9-b364-90c846220e36" (UID: "5fe28cf2-0108-44f9-b364-90c846220e36"). InnerVolumeSpecName "kube-api-access-bq5tv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.650411 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe28cf2-0108-44f9-b364-90c846220e36-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5fe28cf2-0108-44f9-b364-90c846220e36" (UID: "5fe28cf2-0108-44f9-b364-90c846220e36"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.652585 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw"] Dec 05 16:00:47 crc kubenswrapper[4778]: E1205 16:00:47.652868 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60" containerName="collect-profiles" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.652935 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60" containerName="collect-profiles" Dec 05 16:00:47 crc kubenswrapper[4778]: E1205 16:00:47.652998 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe28cf2-0108-44f9-b364-90c846220e36" containerName="route-controller-manager" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.653052 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe28cf2-0108-44f9-b364-90c846220e36" containerName="route-controller-manager" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.653187 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60" containerName="collect-profiles" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.653247 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fe28cf2-0108-44f9-b364-90c846220e36" containerName="route-controller-manager" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.653664 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.667736 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw"] Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.744720 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de6520e-beb1-455c-9db4-dce28d2490a6-config\") pod \"route-controller-manager-7f8fd8bddb-5lnpw\" (UID: \"6de6520e-beb1-455c-9db4-dce28d2490a6\") " pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.745748 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6de6520e-beb1-455c-9db4-dce28d2490a6-serving-cert\") pod \"route-controller-manager-7f8fd8bddb-5lnpw\" (UID: \"6de6520e-beb1-455c-9db4-dce28d2490a6\") " pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.745945 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfspw\" (UniqueName: \"kubernetes.io/projected/6de6520e-beb1-455c-9db4-dce28d2490a6-kube-api-access-kfspw\") pod \"route-controller-manager-7f8fd8bddb-5lnpw\" (UID: \"6de6520e-beb1-455c-9db4-dce28d2490a6\") " pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.746101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6de6520e-beb1-455c-9db4-dce28d2490a6-client-ca\") pod \"route-controller-manager-7f8fd8bddb-5lnpw\" (UID: \"6de6520e-beb1-455c-9db4-dce28d2490a6\") " pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.746320 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq5tv\" (UniqueName: \"kubernetes.io/projected/5fe28cf2-0108-44f9-b364-90c846220e36-kube-api-access-bq5tv\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.746414 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5fe28cf2-0108-44f9-b364-90c846220e36-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.746475 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fe28cf2-0108-44f9-b364-90c846220e36-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.746531 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fe28cf2-0108-44f9-b364-90c846220e36-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.848279 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6de6520e-beb1-455c-9db4-dce28d2490a6-serving-cert\") pod \"route-controller-manager-7f8fd8bddb-5lnpw\" (UID: \"6de6520e-beb1-455c-9db4-dce28d2490a6\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.848918 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfspw\" (UniqueName: \"kubernetes.io/projected/6de6520e-beb1-455c-9db4-dce28d2490a6-kube-api-access-kfspw\") pod \"route-controller-manager-7f8fd8bddb-5lnpw\" (UID: \"6de6520e-beb1-455c-9db4-dce28d2490a6\") " pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.849318 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6de6520e-beb1-455c-9db4-dce28d2490a6-client-ca\") pod \"route-controller-manager-7f8fd8bddb-5lnpw\" (UID: \"6de6520e-beb1-455c-9db4-dce28d2490a6\") " pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.849584 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de6520e-beb1-455c-9db4-dce28d2490a6-config\") pod \"route-controller-manager-7f8fd8bddb-5lnpw\" (UID: \"6de6520e-beb1-455c-9db4-dce28d2490a6\") " pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.850345 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6de6520e-beb1-455c-9db4-dce28d2490a6-client-ca\") pod \"route-controller-manager-7f8fd8bddb-5lnpw\" (UID: \"6de6520e-beb1-455c-9db4-dce28d2490a6\") " pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.853781 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de6520e-beb1-455c-9db4-dce28d2490a6-config\") pod \"route-controller-manager-7f8fd8bddb-5lnpw\" (UID: \"6de6520e-beb1-455c-9db4-dce28d2490a6\") " pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.859076 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6de6520e-beb1-455c-9db4-dce28d2490a6-serving-cert\") pod \"route-controller-manager-7f8fd8bddb-5lnpw\" (UID: \"6de6520e-beb1-455c-9db4-dce28d2490a6\") " pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.865719 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfspw\" (UniqueName: \"kubernetes.io/projected/6de6520e-beb1-455c-9db4-dce28d2490a6-kube-api-access-kfspw\") pod \"route-controller-manager-7f8fd8bddb-5lnpw\" (UID: \"6de6520e-beb1-455c-9db4-dce28d2490a6\") " pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:47 crc kubenswrapper[4778]: I1205 16:00:47.996879 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:48 crc kubenswrapper[4778]: I1205 16:00:48.223500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" event={"ID":"5fe28cf2-0108-44f9-b364-90c846220e36","Type":"ContainerDied","Data":"d2ba4d8a9b065604cc1eda3c3cbf672fa88dc674f4dc0c2a6bb503b5e937aa78"} Dec 05 16:00:48 crc kubenswrapper[4778]: I1205 16:00:48.223553 4778 scope.go:117] "RemoveContainer" containerID="74734f95065baa768d637b46f723a9bd8258542654b9797e62935fd038d54a16" Dec 05 16:00:48 crc kubenswrapper[4778]: I1205 16:00:48.223677 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6" Dec 05 16:00:48 crc kubenswrapper[4778]: I1205 16:00:48.270031 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6"] Dec 05 16:00:48 crc kubenswrapper[4778]: I1205 16:00:48.275322 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4dbb9d8d-6drd6"] Dec 05 16:00:48 crc kubenswrapper[4778]: I1205 16:00:48.427562 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw"] Dec 05 16:00:49 crc kubenswrapper[4778]: I1205 16:00:49.233323 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" event={"ID":"6de6520e-beb1-455c-9db4-dce28d2490a6","Type":"ContainerStarted","Data":"6d84e99a90e90008ad4cc35637405d7bf7e58073b5a0c4558130868fddeb9b4f"} Dec 05 16:00:49 crc kubenswrapper[4778]: I1205 16:00:49.233402 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" event={"ID":"6de6520e-beb1-455c-9db4-dce28d2490a6","Type":"ContainerStarted","Data":"31be39733a2e50d9cdb6b25ba5fc7f83940cf3b0036986269a2168c8edaac5b9"} Dec 05 16:00:49 crc kubenswrapper[4778]: I1205 16:00:49.233708 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:49 crc kubenswrapper[4778]: I1205 16:00:49.237957 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" Dec 05 16:00:49 crc kubenswrapper[4778]: I1205 16:00:49.256385 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f8fd8bddb-5lnpw" podStartSLOduration=5.256340767 podStartE2EDuration="5.256340767s" podCreationTimestamp="2025-12-05 16:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:00:49.251549606 +0000 UTC m=+336.355345986" watchObservedRunningTime="2025-12-05 16:00:49.256340767 +0000 UTC m=+336.360137177" Dec 05 16:00:49 crc kubenswrapper[4778]: I1205 16:00:49.257329 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe28cf2-0108-44f9-b364-90c846220e36" path="/var/lib/kubelet/pods/5fe28cf2-0108-44f9-b364-90c846220e36/volumes" Dec 05 16:01:03 crc kubenswrapper[4778]: I1205 16:01:03.414614 4778 patch_prober.go:28] 
interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:01:03 crc kubenswrapper[4778]: I1205 16:01:03.415060 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:01:04 crc kubenswrapper[4778]: I1205 16:01:04.463411 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m"] Dec 05 16:01:04 crc kubenswrapper[4778]: I1205 16:01:04.463642 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" podUID="ebde0a6c-6b90-4bce-9174-324dd7ce48ad" containerName="controller-manager" containerID="cri-o://87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea" gracePeriod=30 Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.042516 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.212544 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-serving-cert\") pod \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.213203 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-client-ca\") pod \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.213479 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-proxy-ca-bundles\") pod \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.213615 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-config\") pod \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.213796 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9hm5\" (UniqueName: \"kubernetes.io/projected/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-kube-api-access-k9hm5\") pod \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\" (UID: \"ebde0a6c-6b90-4bce-9174-324dd7ce48ad\") " Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.214310 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-client-ca" (OuterVolumeSpecName: "client-ca") pod "ebde0a6c-6b90-4bce-9174-324dd7ce48ad" (UID: "ebde0a6c-6b90-4bce-9174-324dd7ce48ad"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.214706 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ebde0a6c-6b90-4bce-9174-324dd7ce48ad" (UID: "ebde0a6c-6b90-4bce-9174-324dd7ce48ad"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.218133 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-config" (OuterVolumeSpecName: "config") pod "ebde0a6c-6b90-4bce-9174-324dd7ce48ad" (UID: "ebde0a6c-6b90-4bce-9174-324dd7ce48ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.220196 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-kube-api-access-k9hm5" (OuterVolumeSpecName: "kube-api-access-k9hm5") pod "ebde0a6c-6b90-4bce-9174-324dd7ce48ad" (UID: "ebde0a6c-6b90-4bce-9174-324dd7ce48ad"). InnerVolumeSpecName "kube-api-access-k9hm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.221501 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ebde0a6c-6b90-4bce-9174-324dd7ce48ad" (UID: "ebde0a6c-6b90-4bce-9174-324dd7ce48ad"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.315806 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.315847 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.315857 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.315871 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.315881 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9hm5\" (UniqueName: \"kubernetes.io/projected/ebde0a6c-6b90-4bce-9174-324dd7ce48ad-kube-api-access-k9hm5\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.343386 4778 generic.go:334] "Generic (PLEG): container finished" podID="ebde0a6c-6b90-4bce-9174-324dd7ce48ad" containerID="87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea" exitCode=0 Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.343426 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" event={"ID":"ebde0a6c-6b90-4bce-9174-324dd7ce48ad","Type":"ContainerDied","Data":"87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea"} Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.343457 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" event={"ID":"ebde0a6c-6b90-4bce-9174-324dd7ce48ad","Type":"ContainerDied","Data":"4991c8acec1b8ad72bf5eadbb0a9c73f31a394d1cd643b3d5683f31da3911708"} Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.343489 4778 scope.go:117] "RemoveContainer" containerID="87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.343515 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.364966 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m"] Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.368380 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5cf665bbc7-7tl8m"] Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.372922 4778 scope.go:117] "RemoveContainer" containerID="87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea" Dec 05 16:01:05 crc kubenswrapper[4778]: E1205 16:01:05.373318 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea\": container with ID starting with 87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea not found: ID does not exist" containerID="87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.373401 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea"} err="failed to get container status \"87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea\": rpc error: code = NotFound desc = could not find container \"87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea\": container with ID starting with 87b263def95e0dbc1b175b95013de639a27524eb096361ede09ae78812d8bdea not found: ID does not exist" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.523280 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx"] Dec 05 16:01:05 crc kubenswrapper[4778]: E1205 16:01:05.523530 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebde0a6c-6b90-4bce-9174-324dd7ce48ad" containerName="controller-manager" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.523546 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebde0a6c-6b90-4bce-9174-324dd7ce48ad" containerName="controller-manager" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.523654 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebde0a6c-6b90-4bce-9174-324dd7ce48ad" containerName="controller-manager" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.524080 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.526005 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.526305 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.534635 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.534839 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.535046 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.535866 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.540032 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.541126 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx"] Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.621782 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0079c4d-baef-4215-aadf-5699407ccc81-config\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.621833 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0079c4d-baef-4215-aadf-5699407ccc81-proxy-ca-bundles\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.621871 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0079c4d-baef-4215-aadf-5699407ccc81-client-ca\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.621900 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbfrc\" (UniqueName: \"kubernetes.io/projected/c0079c4d-baef-4215-aadf-5699407ccc81-kube-api-access-gbfrc\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.621928 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c0079c4d-baef-4215-aadf-5699407ccc81-serving-cert\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.722588 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbfrc\" (UniqueName: \"kubernetes.io/projected/c0079c4d-baef-4215-aadf-5699407ccc81-kube-api-access-gbfrc\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.722645 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0079c4d-baef-4215-aadf-5699407ccc81-serving-cert\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.722705 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0079c4d-baef-4215-aadf-5699407ccc81-config\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.722724 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0079c4d-baef-4215-aadf-5699407ccc81-proxy-ca-bundles\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.722750 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0079c4d-baef-4215-aadf-5699407ccc81-client-ca\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.723622 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0079c4d-baef-4215-aadf-5699407ccc81-client-ca\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.723704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0079c4d-baef-4215-aadf-5699407ccc81-proxy-ca-bundles\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.724522 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0079c4d-baef-4215-aadf-5699407ccc81-config\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " 
pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.735353 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0079c4d-baef-4215-aadf-5699407ccc81-serving-cert\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.746125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbfrc\" (UniqueName: \"kubernetes.io/projected/c0079c4d-baef-4215-aadf-5699407ccc81-kube-api-access-gbfrc\") pod \"controller-manager-6cdcf4f87c-dskxx\" (UID: \"c0079c4d-baef-4215-aadf-5699407ccc81\") " pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:05 crc kubenswrapper[4778]: I1205 16:01:05.883326 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:06 crc kubenswrapper[4778]: I1205 16:01:06.310112 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx"] Dec 05 16:01:06 crc kubenswrapper[4778]: I1205 16:01:06.354737 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" event={"ID":"c0079c4d-baef-4215-aadf-5699407ccc81","Type":"ContainerStarted","Data":"58c76c623ae36388ec2cefc871cc6ed22226e424efd0629fdbbd723497ecdbc1"} Dec 05 16:01:07 crc kubenswrapper[4778]: I1205 16:01:07.261823 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebde0a6c-6b90-4bce-9174-324dd7ce48ad" path="/var/lib/kubelet/pods/ebde0a6c-6b90-4bce-9174-324dd7ce48ad/volumes" Dec 05 16:01:07 crc kubenswrapper[4778]: I1205 16:01:07.363227 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" event={"ID":"c0079c4d-baef-4215-aadf-5699407ccc81","Type":"ContainerStarted","Data":"2b574968dd90ffc93f257d9d595101b42df420fa539abb8f9df76e9f29c8d75e"} Dec 05 16:01:07 crc kubenswrapper[4778]: I1205 16:01:07.363594 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:07 crc kubenswrapper[4778]: I1205 16:01:07.367697 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" Dec 05 16:01:07 crc kubenswrapper[4778]: I1205 16:01:07.380761 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cdcf4f87c-dskxx" podStartSLOduration=3.380743068 podStartE2EDuration="3.380743068s" podCreationTimestamp="2025-12-05 16:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:01:07.378462926 +0000 UTC m=+354.482259316" watchObservedRunningTime="2025-12-05 16:01:07.380743068 +0000 UTC m=+354.484539458" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.341082 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tv4v9"] Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.342983 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.355340 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tv4v9"] Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.539068 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a5ae862-4302-40d3-ad12-f3a8057b42d5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.539126 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a5ae862-4302-40d3-ad12-f3a8057b42d5-registry-certificates\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.539155 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a5ae862-4302-40d3-ad12-f3a8057b42d5-bound-sa-token\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.539172 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a5ae862-4302-40d3-ad12-f3a8057b42d5-trusted-ca\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.539203 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a5ae862-4302-40d3-ad12-f3a8057b42d5-registry-tls\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.539222 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a5ae862-4302-40d3-ad12-f3a8057b42d5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.539260 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.539283 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6mnf\" (UniqueName: 
\"kubernetes.io/projected/7a5ae862-4302-40d3-ad12-f3a8057b42d5-kube-api-access-s6mnf\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.570496 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.640918 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6mnf\" (UniqueName: \"kubernetes.io/projected/7a5ae862-4302-40d3-ad12-f3a8057b42d5-kube-api-access-s6mnf\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.641000 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a5ae862-4302-40d3-ad12-f3a8057b42d5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.641070 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a5ae862-4302-40d3-ad12-f3a8057b42d5-registry-certificates\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.641101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a5ae862-4302-40d3-ad12-f3a8057b42d5-bound-sa-token\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.641133 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a5ae862-4302-40d3-ad12-f3a8057b42d5-trusted-ca\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.641192 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a5ae862-4302-40d3-ad12-f3a8057b42d5-registry-tls\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.641228 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a5ae862-4302-40d3-ad12-f3a8057b42d5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.642463 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a5ae862-4302-40d3-ad12-f3a8057b42d5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.642997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a5ae862-4302-40d3-ad12-f3a8057b42d5-trusted-ca\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.643012 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a5ae862-4302-40d3-ad12-f3a8057b42d5-registry-certificates\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.649239 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a5ae862-4302-40d3-ad12-f3a8057b42d5-registry-tls\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.658501 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a5ae862-4302-40d3-ad12-f3a8057b42d5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.668050 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a5ae862-4302-40d3-ad12-f3a8057b42d5-bound-sa-token\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.674197 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6mnf\" (UniqueName: \"kubernetes.io/projected/7a5ae862-4302-40d3-ad12-f3a8057b42d5-kube-api-access-s6mnf\") pod \"image-registry-66df7c8f76-tv4v9\" (UID: \"7a5ae862-4302-40d3-ad12-f3a8057b42d5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:14 crc kubenswrapper[4778]: I1205 16:01:14.971138 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:15 crc kubenswrapper[4778]: I1205 16:01:15.463286 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tv4v9"] Dec 05 16:01:16 crc kubenswrapper[4778]: I1205 16:01:16.420928 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" event={"ID":"7a5ae862-4302-40d3-ad12-f3a8057b42d5","Type":"ContainerStarted","Data":"44759f5cf585454b1918cd1ad89c1f666c9dc255675dd60e7d4fb3cf6ef36ca5"} Dec 05 16:01:16 crc kubenswrapper[4778]: I1205 16:01:16.421213 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" event={"ID":"7a5ae862-4302-40d3-ad12-f3a8057b42d5","Type":"ContainerStarted","Data":"7677e4d35cfd4618042a410e5ffdaab80eb481a8a4e8cde66d79e5a21f2e8b73"} Dec 05 16:01:16 crc kubenswrapper[4778]: I1205 16:01:16.421242 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" Dec 05 16:01:16 crc kubenswrapper[4778]: I1205 16:01:16.440567 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9" podStartSLOduration=2.440534922 podStartE2EDuration="2.440534922s" podCreationTimestamp="2025-12-05 16:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:01:16.438399514 +0000 UTC m=+363.542195934" watchObservedRunningTime="2025-12-05 16:01:16.440534922 +0000 UTC m=+363.544331342" Dec 05 16:01:33 crc kubenswrapper[4778]: I1205 16:01:33.415021 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:01:33 crc kubenswrapper[4778]: I1205 16:01:33.415472 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:01:33 crc kubenswrapper[4778]: I1205 16:01:33.415511 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 16:01:33 crc kubenswrapper[4778]: I1205 16:01:33.416001 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d94c7d5a3c87642bb8001766b12e02cd4c800446b73b1de7ad07648c1824c6e"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:01:33 crc kubenswrapper[4778]: I1205 16:01:33.416046 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://3d94c7d5a3c87642bb8001766b12e02cd4c800446b73b1de7ad07648c1824c6e" gracePeriod=600 Dec 05 16:01:33 crc kubenswrapper[4778]: I1205 
Dec 05 16:01:33 crc kubenswrapper[4778]: I1205 16:01:33.537300 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"3d94c7d5a3c87642bb8001766b12e02cd4c800446b73b1de7ad07648c1824c6e"}
Dec 05 16:01:33 crc kubenswrapper[4778]: I1205 16:01:33.537412 4778 scope.go:117] "RemoveContainer" containerID="239aa80152e1eef6e32fea95c700bdcf84d8fd3b6cea09325d163676b524cb07"
Dec 05 16:01:34 crc kubenswrapper[4778]: I1205 16:01:34.547833 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"bf4aa3e6482ce3f53fbdeb198457c65bd5fd850856e011867fd9a803a7e3ab35"}
Dec 05 16:01:34 crc kubenswrapper[4778]: I1205 16:01:34.976512 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-tv4v9"
Dec 05 16:01:35 crc kubenswrapper[4778]: I1205 16:01:35.046098 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m6452"]
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.629920 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-plzrl"]
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.630845 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-plzrl" podUID="e5267d1d-ec1f-461d-acdc-57303aac7015" containerName="registry-server" containerID="cri-o://4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17" gracePeriod=30
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.639951 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wkk25"]
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.640180 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wkk25" podUID="8cff4721-de89-49fa-9f19-682ec8ae8e64" containerName="registry-server" containerID="cri-o://54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae" gracePeriod=30
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.651301 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ptlw8"]
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.651565 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" podUID="66a3882a-e9bc-40d4-b51f-e47d9354f53a" containerName="marketplace-operator" containerID="cri-o://1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9" gracePeriod=30
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.659270 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwxnr"]
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.659551 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gwxnr" podUID="70c6979b-453d-49a4-889e-e46eff9af778" containerName="registry-server" containerID="cri-o://b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6" gracePeriod=30
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.666879 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tlnh4"]
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.667443 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tlnh4" podUID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" containerName="registry-server" containerID="cri-o://6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74" gracePeriod=30
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.678540 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lc9sh"]
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.679309 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.691744 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lc9sh"]
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.717830 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cgx2\" (UniqueName: \"kubernetes.io/projected/bddf2447-16af-4f91-ab3b-1d910c27027a-kube-api-access-8cgx2\") pod \"marketplace-operator-79b997595-lc9sh\" (UID: \"bddf2447-16af-4f91-ab3b-1d910c27027a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.717875 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bddf2447-16af-4f91-ab3b-1d910c27027a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lc9sh\" (UID: \"bddf2447-16af-4f91-ab3b-1d910c27027a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.717900 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bddf2447-16af-4f91-ab3b-1d910c27027a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lc9sh\" (UID: \"bddf2447-16af-4f91-ab3b-1d910c27027a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.818870 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cgx2\" (UniqueName: \"kubernetes.io/projected/bddf2447-16af-4f91-ab3b-1d910c27027a-kube-api-access-8cgx2\") pod \"marketplace-operator-79b997595-lc9sh\" (UID: \"bddf2447-16af-4f91-ab3b-1d910c27027a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.819212 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bddf2447-16af-4f91-ab3b-1d910c27027a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lc9sh\" (UID: \"bddf2447-16af-4f91-ab3b-1d910c27027a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.819248 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bddf2447-16af-4f91-ab3b-1d910c27027a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lc9sh\" (UID: \"bddf2447-16af-4f91-ab3b-1d910c27027a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.821305 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bddf2447-16af-4f91-ab3b-1d910c27027a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lc9sh\" (UID: \"bddf2447-16af-4f91-ab3b-1d910c27027a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.832870 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bddf2447-16af-4f91-ab3b-1d910c27027a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lc9sh\" (UID: \"bddf2447-16af-4f91-ab3b-1d910c27027a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:52 crc kubenswrapper[4778]: I1205 16:01:52.841226 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cgx2\" (UniqueName: \"kubernetes.io/projected/bddf2447-16af-4f91-ab3b-1d910c27027a-kube-api-access-8cgx2\") pod \"marketplace-operator-79b997595-lc9sh\" (UID: \"bddf2447-16af-4f91-ab3b-1d910c27027a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.101645 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.112755 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-plzrl"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.121707 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tlnh4"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.138633 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wkk25"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.140619 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwxnr"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.164199 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8"
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.224455 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb1076bb-639d-42e5-ab8c-d13eb121cc95-catalog-content\") pod \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\" (UID: \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.224725 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66a3882a-e9bc-40d4-b51f-e47d9354f53a-marketplace-trusted-ca\") pod \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\" (UID: \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.224833 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66a3882a-e9bc-40d4-b51f-e47d9354f53a-marketplace-operator-metrics\") pod \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\" (UID: \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.224978 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5267d1d-ec1f-461d-acdc-57303aac7015-utilities\") pod \"e5267d1d-ec1f-461d-acdc-57303aac7015\" (UID: \"e5267d1d-ec1f-461d-acdc-57303aac7015\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.225078 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cff4721-de89-49fa-9f19-682ec8ae8e64-utilities\") pod \"8cff4721-de89-49fa-9f19-682ec8ae8e64\" (UID: \"8cff4721-de89-49fa-9f19-682ec8ae8e64\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.225174 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cff4721-de89-49fa-9f19-682ec8ae8e64-catalog-content\") pod \"8cff4721-de89-49fa-9f19-682ec8ae8e64\" (UID: \"8cff4721-de89-49fa-9f19-682ec8ae8e64\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.225279 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xvk7\" (UniqueName: \"kubernetes.io/projected/e5267d1d-ec1f-461d-acdc-57303aac7015-kube-api-access-2xvk7\") pod \"e5267d1d-ec1f-461d-acdc-57303aac7015\" (UID: \"e5267d1d-ec1f-461d-acdc-57303aac7015\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.225403 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c6979b-453d-49a4-889e-e46eff9af778-utilities\") pod \"70c6979b-453d-49a4-889e-e46eff9af778\" (UID: \"70c6979b-453d-49a4-889e-e46eff9af778\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.225527 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c6979b-453d-49a4-889e-e46eff9af778-catalog-content\") pod \"70c6979b-453d-49a4-889e-e46eff9af778\" (UID: \"70c6979b-453d-49a4-889e-e46eff9af778\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.225626 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2fp9\" (UniqueName: 
\"kubernetes.io/projected/eb1076bb-639d-42e5-ab8c-d13eb121cc95-kube-api-access-f2fp9\") pod \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\" (UID: \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.225743 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb1076bb-639d-42e5-ab8c-d13eb121cc95-utilities\") pod \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\" (UID: \"eb1076bb-639d-42e5-ab8c-d13eb121cc95\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.225857 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvjct\" (UniqueName: \"kubernetes.io/projected/66a3882a-e9bc-40d4-b51f-e47d9354f53a-kube-api-access-nvjct\") pod \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\" (UID: \"66a3882a-e9bc-40d4-b51f-e47d9354f53a\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.226022 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knrld\" (UniqueName: \"kubernetes.io/projected/8cff4721-de89-49fa-9f19-682ec8ae8e64-kube-api-access-knrld\") pod \"8cff4721-de89-49fa-9f19-682ec8ae8e64\" (UID: \"8cff4721-de89-49fa-9f19-682ec8ae8e64\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.226168 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5267d1d-ec1f-461d-acdc-57303aac7015-catalog-content\") pod \"e5267d1d-ec1f-461d-acdc-57303aac7015\" (UID: \"e5267d1d-ec1f-461d-acdc-57303aac7015\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.226463 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwm9t\" (UniqueName: \"kubernetes.io/projected/70c6979b-453d-49a4-889e-e46eff9af778-kube-api-access-qwm9t\") pod \"70c6979b-453d-49a4-889e-e46eff9af778\" (UID: \"70c6979b-453d-49a4-889e-e46eff9af778\") " Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.229240 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70c6979b-453d-49a4-889e-e46eff9af778-utilities" (OuterVolumeSpecName: "utilities") pod "70c6979b-453d-49a4-889e-e46eff9af778" (UID: "70c6979b-453d-49a4-889e-e46eff9af778"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.230210 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cff4721-de89-49fa-9f19-682ec8ae8e64-utilities" (OuterVolumeSpecName: "utilities") pod "8cff4721-de89-49fa-9f19-682ec8ae8e64" (UID: "8cff4721-de89-49fa-9f19-682ec8ae8e64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.230428 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5267d1d-ec1f-461d-acdc-57303aac7015-utilities" (OuterVolumeSpecName: "utilities") pod "e5267d1d-ec1f-461d-acdc-57303aac7015" (UID: "e5267d1d-ec1f-461d-acdc-57303aac7015"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.232714 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66a3882a-e9bc-40d4-b51f-e47d9354f53a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "66a3882a-e9bc-40d4-b51f-e47d9354f53a" (UID: "66a3882a-e9bc-40d4-b51f-e47d9354f53a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.234795 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb1076bb-639d-42e5-ab8c-d13eb121cc95-utilities" (OuterVolumeSpecName: "utilities") pod "eb1076bb-639d-42e5-ab8c-d13eb121cc95" (UID: "eb1076bb-639d-42e5-ab8c-d13eb121cc95"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.235161 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66a3882a-e9bc-40d4-b51f-e47d9354f53a-kube-api-access-nvjct" (OuterVolumeSpecName: "kube-api-access-nvjct") pod "66a3882a-e9bc-40d4-b51f-e47d9354f53a" (UID: "66a3882a-e9bc-40d4-b51f-e47d9354f53a"). InnerVolumeSpecName "kube-api-access-nvjct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.238290 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a3882a-e9bc-40d4-b51f-e47d9354f53a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "66a3882a-e9bc-40d4-b51f-e47d9354f53a" (UID: "66a3882a-e9bc-40d4-b51f-e47d9354f53a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.238657 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb1076bb-639d-42e5-ab8c-d13eb121cc95-kube-api-access-f2fp9" (OuterVolumeSpecName: "kube-api-access-f2fp9") pod "eb1076bb-639d-42e5-ab8c-d13eb121cc95" (UID: "eb1076bb-639d-42e5-ab8c-d13eb121cc95"). InnerVolumeSpecName "kube-api-access-f2fp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.243815 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cff4721-de89-49fa-9f19-682ec8ae8e64-kube-api-access-knrld" (OuterVolumeSpecName: "kube-api-access-knrld") pod "8cff4721-de89-49fa-9f19-682ec8ae8e64" (UID: "8cff4721-de89-49fa-9f19-682ec8ae8e64"). InnerVolumeSpecName "kube-api-access-knrld". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.243832 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c6979b-453d-49a4-889e-e46eff9af778-kube-api-access-qwm9t" (OuterVolumeSpecName: "kube-api-access-qwm9t") pod "70c6979b-453d-49a4-889e-e46eff9af778" (UID: "70c6979b-453d-49a4-889e-e46eff9af778"). InnerVolumeSpecName "kube-api-access-qwm9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.245206 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvjct\" (UniqueName: \"kubernetes.io/projected/66a3882a-e9bc-40d4-b51f-e47d9354f53a-kube-api-access-nvjct\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.245227 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66a3882a-e9bc-40d4-b51f-e47d9354f53a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.245236 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66a3882a-e9bc-40d4-b51f-e47d9354f53a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.246078 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5267d1d-ec1f-461d-acdc-57303aac7015-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.246103 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cff4721-de89-49fa-9f19-682ec8ae8e64-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.246117 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c6979b-453d-49a4-889e-e46eff9af778-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.246130 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2fp9\" (UniqueName: \"kubernetes.io/projected/eb1076bb-639d-42e5-ab8c-d13eb121cc95-kube-api-access-f2fp9\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.246602 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb1076bb-639d-42e5-ab8c-d13eb121cc95-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.248294 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5267d1d-ec1f-461d-acdc-57303aac7015-kube-api-access-2xvk7" (OuterVolumeSpecName: "kube-api-access-2xvk7") pod "e5267d1d-ec1f-461d-acdc-57303aac7015" (UID: "e5267d1d-ec1f-461d-acdc-57303aac7015"). InnerVolumeSpecName "kube-api-access-2xvk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.259712 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70c6979b-453d-49a4-889e-e46eff9af778-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70c6979b-453d-49a4-889e-e46eff9af778" (UID: "70c6979b-453d-49a4-889e-e46eff9af778"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.298134 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5267d1d-ec1f-461d-acdc-57303aac7015-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5267d1d-ec1f-461d-acdc-57303aac7015" (UID: "e5267d1d-ec1f-461d-acdc-57303aac7015"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.299676 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cff4721-de89-49fa-9f19-682ec8ae8e64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8cff4721-de89-49fa-9f19-682ec8ae8e64" (UID: "8cff4721-de89-49fa-9f19-682ec8ae8e64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.343196 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb1076bb-639d-42e5-ab8c-d13eb121cc95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb1076bb-639d-42e5-ab8c-d13eb121cc95" (UID: "eb1076bb-639d-42e5-ab8c-d13eb121cc95"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.347874 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb1076bb-639d-42e5-ab8c-d13eb121cc95-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.347916 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cff4721-de89-49fa-9f19-682ec8ae8e64-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.347927 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xvk7\" (UniqueName: \"kubernetes.io/projected/e5267d1d-ec1f-461d-acdc-57303aac7015-kube-api-access-2xvk7\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.347939 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c6979b-453d-49a4-889e-e46eff9af778-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.347948 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knrld\" (UniqueName: \"kubernetes.io/projected/8cff4721-de89-49fa-9f19-682ec8ae8e64-kube-api-access-knrld\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.347961 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5267d1d-ec1f-461d-acdc-57303aac7015-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.347972 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwm9t\" (UniqueName: \"kubernetes.io/projected/70c6979b-453d-49a4-889e-e46eff9af778-kube-api-access-qwm9t\") on node \"crc\" DevicePath \"\"" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.574141 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lc9sh"] Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.670912 4778 generic.go:334] "Generic (PLEG): container finished" podID="70c6979b-453d-49a4-889e-e46eff9af778" containerID="b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6" exitCode=0 Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.670983 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwxnr" 
event={"ID":"70c6979b-453d-49a4-889e-e46eff9af778","Type":"ContainerDied","Data":"b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6"} Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.671018 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwxnr" event={"ID":"70c6979b-453d-49a4-889e-e46eff9af778","Type":"ContainerDied","Data":"f6cd4ce76475c37356ae9abd3e7ce63b2fb7a966480f805a2e7e92083fc16c7c"} Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.671056 4778 scope.go:117] "RemoveContainer" containerID="b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.671221 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwxnr" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.677163 4778 generic.go:334] "Generic (PLEG): container finished" podID="66a3882a-e9bc-40d4-b51f-e47d9354f53a" containerID="1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9" exitCode=0 Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.677244 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" event={"ID":"66a3882a-e9bc-40d4-b51f-e47d9354f53a","Type":"ContainerDied","Data":"1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9"} Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.677274 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" event={"ID":"66a3882a-e9bc-40d4-b51f-e47d9354f53a","Type":"ContainerDied","Data":"85be95007696f2605993739d0fbe4ffbfc6892c36cc9a4e0af09cbafe93fe2a6"} Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.677275 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ptlw8" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.679597 4778 generic.go:334] "Generic (PLEG): container finished" podID="e5267d1d-ec1f-461d-acdc-57303aac7015" containerID="4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17" exitCode=0 Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.679684 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-plzrl" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.679669 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plzrl" event={"ID":"e5267d1d-ec1f-461d-acdc-57303aac7015","Type":"ContainerDied","Data":"4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17"} Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.679959 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plzrl" event={"ID":"e5267d1d-ec1f-461d-acdc-57303aac7015","Type":"ContainerDied","Data":"4a158d8a12d5ca86d604a7718eb447da323a4fd559b55c8120284263891525ce"} Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.682832 4778 generic.go:334] "Generic (PLEG): container finished" podID="8cff4721-de89-49fa-9f19-682ec8ae8e64" containerID="54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae" exitCode=0 Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.682891 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkk25" event={"ID":"8cff4721-de89-49fa-9f19-682ec8ae8e64","Type":"ContainerDied","Data":"54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae"} Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.682919 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkk25" event={"ID":"8cff4721-de89-49fa-9f19-682ec8ae8e64","Type":"ContainerDied","Data":"6d0911295a9a1b3eafc0d08c9a9954d0fa4d3f24bcf28af09a1b4ea434a4ec29"} Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.683046 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wkk25" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.695160 4778 generic.go:334] "Generic (PLEG): container finished" podID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" containerID="6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74" exitCode=0 Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.695344 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tlnh4" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.695298 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlnh4" event={"ID":"eb1076bb-639d-42e5-ab8c-d13eb121cc95","Type":"ContainerDied","Data":"6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74"} Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.695433 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlnh4" event={"ID":"eb1076bb-639d-42e5-ab8c-d13eb121cc95","Type":"ContainerDied","Data":"31527b7d0afc73b65dd40049eee8fecd8f623a096d1bc3e1cad6cf72585dcf50"} Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.697623 4778 scope.go:117] "RemoveContainer" containerID="77ae6602820cc302573f8db0f1e03119b2b0ad85873854d51cfcfdcdb3c81250" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.698359 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh" event={"ID":"bddf2447-16af-4f91-ab3b-1d910c27027a","Type":"ContainerStarted","Data":"34aa40b85f4b99f367262e504419550bcbb31724f15564d0b3a6c7b5afa6cef8"} Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.714818 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ptlw8"] Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.728626 4778 scope.go:117] "RemoveContainer" containerID="005c15e3d7a26871af424c95767ca68ff3cfeab4c2b4971d856fbb5dcfa4796b" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.738846 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ptlw8"] Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.745582 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwxnr"] Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.751067 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwxnr"] Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.758450 4778 scope.go:117] "RemoveContainer" containerID="b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6" Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.759323 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6\": container with ID starting with b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6 not found: ID does not exist" containerID="b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.759423 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6"} err="failed to get container status \"b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6\": rpc error: code = NotFound desc = could not find container \"b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6\": container with ID starting with b331ae24e50777678f279738085791c712eaa3881f6363f8f1dcad6a08d6f7f6 not found: ID does not exist" Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.759457 4778 scope.go:117] "RemoveContainer" containerID="77ae6602820cc302573f8db0f1e03119b2b0ad85873854d51cfcfdcdb3c81250" Dec 05 16:01:53 crc 
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.760631 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77ae6602820cc302573f8db0f1e03119b2b0ad85873854d51cfcfdcdb3c81250\": container with ID starting with 77ae6602820cc302573f8db0f1e03119b2b0ad85873854d51cfcfdcdb3c81250 not found: ID does not exist" containerID="77ae6602820cc302573f8db0f1e03119b2b0ad85873854d51cfcfdcdb3c81250"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.760659 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ae6602820cc302573f8db0f1e03119b2b0ad85873854d51cfcfdcdb3c81250"} err="failed to get container status \"77ae6602820cc302573f8db0f1e03119b2b0ad85873854d51cfcfdcdb3c81250\": rpc error: code = NotFound desc = could not find container \"77ae6602820cc302573f8db0f1e03119b2b0ad85873854d51cfcfdcdb3c81250\": container with ID starting with 77ae6602820cc302573f8db0f1e03119b2b0ad85873854d51cfcfdcdb3c81250 not found: ID does not exist"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.760677 4778 scope.go:117] "RemoveContainer" containerID="005c15e3d7a26871af424c95767ca68ff3cfeab4c2b4971d856fbb5dcfa4796b"
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.760913 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005c15e3d7a26871af424c95767ca68ff3cfeab4c2b4971d856fbb5dcfa4796b\": container with ID starting with 005c15e3d7a26871af424c95767ca68ff3cfeab4c2b4971d856fbb5dcfa4796b not found: ID does not exist" containerID="005c15e3d7a26871af424c95767ca68ff3cfeab4c2b4971d856fbb5dcfa4796b"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.760941 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005c15e3d7a26871af424c95767ca68ff3cfeab4c2b4971d856fbb5dcfa4796b"} err="failed to get container status \"005c15e3d7a26871af424c95767ca68ff3cfeab4c2b4971d856fbb5dcfa4796b\": rpc error: code = NotFound desc = could not find container \"005c15e3d7a26871af424c95767ca68ff3cfeab4c2b4971d856fbb5dcfa4796b\": container with ID starting with 005c15e3d7a26871af424c95767ca68ff3cfeab4c2b4971d856fbb5dcfa4796b not found: ID does not exist"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.760954 4778 scope.go:117] "RemoveContainer" containerID="1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.761465 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-plzrl"]
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.770435 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-plzrl"]
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.770491 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tlnh4"]
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.773439 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tlnh4"]
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.777207 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wkk25"]
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.781989 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wkk25"]
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.804750 4778 scope.go:117] "RemoveContainer" containerID="e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.821306 4778 scope.go:117] "RemoveContainer" containerID="1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9"
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.821692 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9\": container with ID starting with 1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9 not found: ID does not exist" containerID="1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.821721 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9"} err="failed to get container status \"1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9\": rpc error: code = NotFound desc = could not find container \"1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9\": container with ID starting with 1d6c4ff24ccbec9ab7a01b39b5f0e7a64e4a47ca88678f547bcc6c71f7e58ad9 not found: ID does not exist"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.821742 4778 scope.go:117] "RemoveContainer" containerID="e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9"
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.821945 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9\": container with ID starting with e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9 not found: ID does not exist" containerID="e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.821963 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9"} err="failed to get container status \"e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9\": rpc error: code = NotFound desc = could not find container \"e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9\": container with ID starting with e57f7e50d63c3a275cc572d062ee1b02714bd28baf46f43e19e9417397d0d9a9 not found: ID does not exist"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.821974 4778 scope.go:117] "RemoveContainer" containerID="4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.837189 4778 scope.go:117] "RemoveContainer" containerID="2ec4ca6a996812bc32c022260faddbf1df943351747bde3aee192bc0d228f623"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.849040 4778 scope.go:117] "RemoveContainer" containerID="6147c8dea248235058419b3d71d2b4870d90790749a9eca528192dad8b32c7e2"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.873027 4778 scope.go:117] "RemoveContainer" containerID="4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17"
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.873617 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17\": container with ID starting with 4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17 not found: ID does not exist" containerID="4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.873668 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17"} err="failed to get container status \"4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17\": rpc error: code = NotFound desc = could not find container \"4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17\": container with ID starting with 4709f692c5141c328beb65d94f5b6c2845c41904c5ebf3825cadf52d805b7f17 not found: ID does not exist"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.873703 4778 scope.go:117] "RemoveContainer" containerID="2ec4ca6a996812bc32c022260faddbf1df943351747bde3aee192bc0d228f623"
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.874131 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ec4ca6a996812bc32c022260faddbf1df943351747bde3aee192bc0d228f623\": container with ID starting with 2ec4ca6a996812bc32c022260faddbf1df943351747bde3aee192bc0d228f623 not found: ID does not exist" containerID="2ec4ca6a996812bc32c022260faddbf1df943351747bde3aee192bc0d228f623"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.874155 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ec4ca6a996812bc32c022260faddbf1df943351747bde3aee192bc0d228f623"} err="failed to get container status \"2ec4ca6a996812bc32c022260faddbf1df943351747bde3aee192bc0d228f623\": rpc error: code = NotFound desc = could not find container \"2ec4ca6a996812bc32c022260faddbf1df943351747bde3aee192bc0d228f623\": container with ID starting with 2ec4ca6a996812bc32c022260faddbf1df943351747bde3aee192bc0d228f623 not found: ID does not exist"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.874169 4778 scope.go:117] "RemoveContainer" containerID="6147c8dea248235058419b3d71d2b4870d90790749a9eca528192dad8b32c7e2"
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.874675 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6147c8dea248235058419b3d71d2b4870d90790749a9eca528192dad8b32c7e2\": container with ID starting with 6147c8dea248235058419b3d71d2b4870d90790749a9eca528192dad8b32c7e2 not found: ID does not exist" containerID="6147c8dea248235058419b3d71d2b4870d90790749a9eca528192dad8b32c7e2"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.874724 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6147c8dea248235058419b3d71d2b4870d90790749a9eca528192dad8b32c7e2"} err="failed to get container status \"6147c8dea248235058419b3d71d2b4870d90790749a9eca528192dad8b32c7e2\": rpc error: code = NotFound desc = could not find container \"6147c8dea248235058419b3d71d2b4870d90790749a9eca528192dad8b32c7e2\": container with ID starting with 6147c8dea248235058419b3d71d2b4870d90790749a9eca528192dad8b32c7e2 not found: ID does not exist"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.874747 4778 scope.go:117] "RemoveContainer" containerID="54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.888858 4778 scope.go:117] "RemoveContainer" containerID="dbdd53eaaaf60dcfecd0142cbcf03513d4ab3d32d7ecf2907c470c5b737d5fa9"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.903612 4778 scope.go:117] "RemoveContainer" containerID="8c01060dabd996d45f7ca5581b1dc952b9bfb3fecd2065f0017193b2247f3afa"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.916051 4778 scope.go:117] "RemoveContainer" containerID="54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae"
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.916420 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae\": container with ID starting with 54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae not found: ID does not exist" containerID="54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.916449 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae"} err="failed to get container status \"54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae\": rpc error: code = NotFound desc = could not find container \"54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae\": container with ID starting with 54530fa3217beb917d66ece7ee14d8588d126bc27f49493000205dd2c5fa64ae not found: ID does not exist"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.916471 4778 scope.go:117] "RemoveContainer" containerID="dbdd53eaaaf60dcfecd0142cbcf03513d4ab3d32d7ecf2907c470c5b737d5fa9"
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.916957 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbdd53eaaaf60dcfecd0142cbcf03513d4ab3d32d7ecf2907c470c5b737d5fa9\": container with ID starting with dbdd53eaaaf60dcfecd0142cbcf03513d4ab3d32d7ecf2907c470c5b737d5fa9 not found: ID does not exist" containerID="dbdd53eaaaf60dcfecd0142cbcf03513d4ab3d32d7ecf2907c470c5b737d5fa9"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.916993 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbdd53eaaaf60dcfecd0142cbcf03513d4ab3d32d7ecf2907c470c5b737d5fa9"} err="failed to get container status \"dbdd53eaaaf60dcfecd0142cbcf03513d4ab3d32d7ecf2907c470c5b737d5fa9\": rpc error: code = NotFound desc = could not find container \"dbdd53eaaaf60dcfecd0142cbcf03513d4ab3d32d7ecf2907c470c5b737d5fa9\": container with ID starting with dbdd53eaaaf60dcfecd0142cbcf03513d4ab3d32d7ecf2907c470c5b737d5fa9 not found: ID does not exist"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.917055 4778 scope.go:117] "RemoveContainer" containerID="8c01060dabd996d45f7ca5581b1dc952b9bfb3fecd2065f0017193b2247f3afa"
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.917427 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c01060dabd996d45f7ca5581b1dc952b9bfb3fecd2065f0017193b2247f3afa\": container with ID starting with 8c01060dabd996d45f7ca5581b1dc952b9bfb3fecd2065f0017193b2247f3afa not found: ID does not exist" containerID="8c01060dabd996d45f7ca5581b1dc952b9bfb3fecd2065f0017193b2247f3afa"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.917449 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c01060dabd996d45f7ca5581b1dc952b9bfb3fecd2065f0017193b2247f3afa"} err="failed to get container status \"8c01060dabd996d45f7ca5581b1dc952b9bfb3fecd2065f0017193b2247f3afa\": rpc error: code = NotFound desc = could not find container \"8c01060dabd996d45f7ca5581b1dc952b9bfb3fecd2065f0017193b2247f3afa\": container with ID starting with 8c01060dabd996d45f7ca5581b1dc952b9bfb3fecd2065f0017193b2247f3afa not found: ID does not exist"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.917463 4778 scope.go:117] "RemoveContainer" containerID="6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.932184 4778 scope.go:117] "RemoveContainer" containerID="b95e816204bcbcf9dad9e4cd1d127c73954e4e6161be032395b1722c8fd0087c"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.958837 4778 scope.go:117] "RemoveContainer" containerID="dda4b4412b2b2f999775b70a53aeb23388087148d3c0047f1f6d8c61a6aa3b79"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.979609 4778 scope.go:117] "RemoveContainer" containerID="6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74"
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.979990 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74\": container with ID starting with 6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74 not found: ID does not exist" containerID="6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.980019 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74"} err="failed to get container status \"6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74\": rpc error: code = NotFound desc = could not find container \"6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74\": container with ID starting with 6cfcafdd9b20507b7bf61e2c43d6569a5f17dcb7d7446b38bd65c6c4367f9c74 not found: ID does not exist"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.980050 4778 scope.go:117] "RemoveContainer" containerID="b95e816204bcbcf9dad9e4cd1d127c73954e4e6161be032395b1722c8fd0087c"
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.980457 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b95e816204bcbcf9dad9e4cd1d127c73954e4e6161be032395b1722c8fd0087c\": container with ID starting with b95e816204bcbcf9dad9e4cd1d127c73954e4e6161be032395b1722c8fd0087c not found: ID does not exist" containerID="b95e816204bcbcf9dad9e4cd1d127c73954e4e6161be032395b1722c8fd0087c"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.980487 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95e816204bcbcf9dad9e4cd1d127c73954e4e6161be032395b1722c8fd0087c"} err="failed to get container status \"b95e816204bcbcf9dad9e4cd1d127c73954e4e6161be032395b1722c8fd0087c\": rpc error: code = NotFound desc = could not find container \"b95e816204bcbcf9dad9e4cd1d127c73954e4e6161be032395b1722c8fd0087c\": container with ID starting with b95e816204bcbcf9dad9e4cd1d127c73954e4e6161be032395b1722c8fd0087c not found: ID does not exist"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.980505 4778 scope.go:117] "RemoveContainer" containerID="dda4b4412b2b2f999775b70a53aeb23388087148d3c0047f1f6d8c61a6aa3b79"
Dec 05 16:01:53 crc kubenswrapper[4778]: E1205 16:01:53.980816 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda4b4412b2b2f999775b70a53aeb23388087148d3c0047f1f6d8c61a6aa3b79\": container with ID starting with dda4b4412b2b2f999775b70a53aeb23388087148d3c0047f1f6d8c61a6aa3b79 not found: ID does not exist" containerID="dda4b4412b2b2f999775b70a53aeb23388087148d3c0047f1f6d8c61a6aa3b79"
Dec 05 16:01:53 crc kubenswrapper[4778]: I1205 16:01:53.980870 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda4b4412b2b2f999775b70a53aeb23388087148d3c0047f1f6d8c61a6aa3b79"} err="failed to get container status \"dda4b4412b2b2f999775b70a53aeb23388087148d3c0047f1f6d8c61a6aa3b79\": rpc error: code = NotFound desc = could not find container \"dda4b4412b2b2f999775b70a53aeb23388087148d3c0047f1f6d8c61a6aa3b79\": container with ID starting with dda4b4412b2b2f999775b70a53aeb23388087148d3c0047f1f6d8c61a6aa3b79 not found: ID does not exist"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.705997 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh" event={"ID":"bddf2447-16af-4f91-ab3b-1d910c27027a","Type":"ContainerStarted","Data":"7f66bb9f7f71608f83401e9497fd188d4205a75ef0ad23bb14e69fdad0d901e8"}
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.706257 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.708809 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.727695 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lc9sh" podStartSLOduration=2.7276650289999997 podStartE2EDuration="2.727665029s" podCreationTimestamp="2025-12-05 16:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:01:54.719615713 +0000 UTC m=+401.823412123" watchObservedRunningTime="2025-12-05 16:01:54.727665029 +0000 UTC m=+401.831461429"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839194 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8kfvb"]
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839514 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5267d1d-ec1f-461d-acdc-57303aac7015" containerName="registry-server"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839537 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5267d1d-ec1f-461d-acdc-57303aac7015" containerName="registry-server"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839558 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cff4721-de89-49fa-9f19-682ec8ae8e64" containerName="extract-content"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839570 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cff4721-de89-49fa-9f19-682ec8ae8e64" containerName="extract-content"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839586 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c6979b-453d-49a4-889e-e46eff9af778" containerName="registry-server"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839597 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c6979b-453d-49a4-889e-e46eff9af778" containerName="registry-server"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839611 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cff4721-de89-49fa-9f19-682ec8ae8e64" containerName="registry-server"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839622 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cff4721-de89-49fa-9f19-682ec8ae8e64" containerName="registry-server"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839641 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a3882a-e9bc-40d4-b51f-e47d9354f53a" containerName="marketplace-operator"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839652 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a3882a-e9bc-40d4-b51f-e47d9354f53a" containerName="marketplace-operator"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839670 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5267d1d-ec1f-461d-acdc-57303aac7015" containerName="extract-utilities"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839680 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5267d1d-ec1f-461d-acdc-57303aac7015" containerName="extract-utilities"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839698 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" containerName="extract-utilities"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839709 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" containerName="extract-utilities"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839727 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" containerName="extract-content"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839738 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" containerName="extract-content"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839756 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a3882a-e9bc-40d4-b51f-e47d9354f53a" containerName="marketplace-operator"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839767 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a3882a-e9bc-40d4-b51f-e47d9354f53a" containerName="marketplace-operator"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839780 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" containerName="registry-server"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839791 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" containerName="registry-server"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839811 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cff4721-de89-49fa-9f19-682ec8ae8e64" containerName="extract-utilities"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839822 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cff4721-de89-49fa-9f19-682ec8ae8e64" containerName="extract-utilities"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839836 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c6979b-453d-49a4-889e-e46eff9af778" containerName="extract-utilities"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839846 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c6979b-453d-49a4-889e-e46eff9af778" containerName="extract-utilities"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839862 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5267d1d-ec1f-461d-acdc-57303aac7015" containerName="extract-content"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839872 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5267d1d-ec1f-461d-acdc-57303aac7015" containerName="extract-content"
Dec 05 16:01:54 crc kubenswrapper[4778]: E1205 16:01:54.839886 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c6979b-453d-49a4-889e-e46eff9af778" containerName="extract-content"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.839898 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c6979b-453d-49a4-889e-e46eff9af778" containerName="extract-content"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.840077 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c6979b-453d-49a4-889e-e46eff9af778" containerName="registry-server"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.840101 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cff4721-de89-49fa-9f19-682ec8ae8e64" containerName="registry-server"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.840118 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" containerName="registry-server"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.840134 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5267d1d-ec1f-461d-acdc-57303aac7015" containerName="registry-server"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.840149 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a3882a-e9bc-40d4-b51f-e47d9354f53a" containerName="marketplace-operator"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.840167 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a3882a-e9bc-40d4-b51f-e47d9354f53a" containerName="marketplace-operator"
Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.841298 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kfvb"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.845285 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.846579 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kfvb"] Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.866685 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e1fc109-ed5c-4277-8800-a02b2c92cd1c-catalog-content\") pod \"redhat-marketplace-8kfvb\" (UID: \"6e1fc109-ed5c-4277-8800-a02b2c92cd1c\") " pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.866798 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e1fc109-ed5c-4277-8800-a02b2c92cd1c-utilities\") pod \"redhat-marketplace-8kfvb\" (UID: \"6e1fc109-ed5c-4277-8800-a02b2c92cd1c\") " pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.866853 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhfzl\" (UniqueName: \"kubernetes.io/projected/6e1fc109-ed5c-4277-8800-a02b2c92cd1c-kube-api-access-mhfzl\") pod \"redhat-marketplace-8kfvb\" (UID: \"6e1fc109-ed5c-4277-8800-a02b2c92cd1c\") " pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.968156 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e1fc109-ed5c-4277-8800-a02b2c92cd1c-catalog-content\") pod \"redhat-marketplace-8kfvb\" (UID: \"6e1fc109-ed5c-4277-8800-a02b2c92cd1c\") " pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.968214 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e1fc109-ed5c-4277-8800-a02b2c92cd1c-utilities\") pod \"redhat-marketplace-8kfvb\" (UID: \"6e1fc109-ed5c-4277-8800-a02b2c92cd1c\") " pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.968239 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhfzl\" (UniqueName: \"kubernetes.io/projected/6e1fc109-ed5c-4277-8800-a02b2c92cd1c-kube-api-access-mhfzl\") pod \"redhat-marketplace-8kfvb\" (UID: \"6e1fc109-ed5c-4277-8800-a02b2c92cd1c\") " pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.968620 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e1fc109-ed5c-4277-8800-a02b2c92cd1c-catalog-content\") pod \"redhat-marketplace-8kfvb\" (UID: \"6e1fc109-ed5c-4277-8800-a02b2c92cd1c\") " pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.968704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e1fc109-ed5c-4277-8800-a02b2c92cd1c-utilities\") pod \"redhat-marketplace-8kfvb\" (UID: 
\"6e1fc109-ed5c-4277-8800-a02b2c92cd1c\") " pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:01:54 crc kubenswrapper[4778]: I1205 16:01:54.990834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhfzl\" (UniqueName: \"kubernetes.io/projected/6e1fc109-ed5c-4277-8800-a02b2c92cd1c-kube-api-access-mhfzl\") pod \"redhat-marketplace-8kfvb\" (UID: \"6e1fc109-ed5c-4277-8800-a02b2c92cd1c\") " pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.042509 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-whqgf"] Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.043896 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.048380 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.057921 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-whqgf"] Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.069836 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3bc4f1-bfa2-4b56-a478-8718c3c15bef-catalog-content\") pod \"redhat-operators-whqgf\" (UID: \"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef\") " pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.069902 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3bc4f1-bfa2-4b56-a478-8718c3c15bef-utilities\") pod \"redhat-operators-whqgf\" (UID: \"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef\") " pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.070011 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5g7\" (UniqueName: \"kubernetes.io/projected/fe3bc4f1-bfa2-4b56-a478-8718c3c15bef-kube-api-access-rv5g7\") pod \"redhat-operators-whqgf\" (UID: \"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef\") " pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.171816 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5g7\" (UniqueName: \"kubernetes.io/projected/fe3bc4f1-bfa2-4b56-a478-8718c3c15bef-kube-api-access-rv5g7\") pod \"redhat-operators-whqgf\" (UID: \"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef\") " pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.171897 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3bc4f1-bfa2-4b56-a478-8718c3c15bef-catalog-content\") pod \"redhat-operators-whqgf\" (UID: \"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef\") " pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.171919 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3bc4f1-bfa2-4b56-a478-8718c3c15bef-utilities\") pod \"redhat-operators-whqgf\" (UID: 
\"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef\") " pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.172428 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3bc4f1-bfa2-4b56-a478-8718c3c15bef-utilities\") pod \"redhat-operators-whqgf\" (UID: \"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef\") " pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.172759 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3bc4f1-bfa2-4b56-a478-8718c3c15bef-catalog-content\") pod \"redhat-operators-whqgf\" (UID: \"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef\") " pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.173328 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.191966 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5g7\" (UniqueName: \"kubernetes.io/projected/fe3bc4f1-bfa2-4b56-a478-8718c3c15bef-kube-api-access-rv5g7\") pod \"redhat-operators-whqgf\" (UID: \"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef\") " pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.262598 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66a3882a-e9bc-40d4-b51f-e47d9354f53a" path="/var/lib/kubelet/pods/66a3882a-e9bc-40d4-b51f-e47d9354f53a/volumes" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.263796 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c6979b-453d-49a4-889e-e46eff9af778" path="/var/lib/kubelet/pods/70c6979b-453d-49a4-889e-e46eff9af778/volumes" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.264413 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cff4721-de89-49fa-9f19-682ec8ae8e64" path="/var/lib/kubelet/pods/8cff4721-de89-49fa-9f19-682ec8ae8e64/volumes" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.265355 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5267d1d-ec1f-461d-acdc-57303aac7015" path="/var/lib/kubelet/pods/e5267d1d-ec1f-461d-acdc-57303aac7015/volumes" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.265949 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb1076bb-639d-42e5-ab8c-d13eb121cc95" path="/var/lib/kubelet/pods/eb1076bb-639d-42e5-ab8c-d13eb121cc95/volumes" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.364405 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.567514 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kfvb"] Dec 05 16:01:55 crc kubenswrapper[4778]: W1205 16:01:55.573469 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e1fc109_ed5c_4277_8800_a02b2c92cd1c.slice/crio-743f1eff92a292d61f742a16866c0b24af2dcb52619ddf166a13ecf280c10a25 WatchSource:0}: Error finding container 743f1eff92a292d61f742a16866c0b24af2dcb52619ddf166a13ecf280c10a25: Status 404 returned error can't find the container with id 743f1eff92a292d61f742a16866c0b24af2dcb52619ddf166a13ecf280c10a25 Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.717024 4778 generic.go:334] "Generic (PLEG): container finished" podID="6e1fc109-ed5c-4277-8800-a02b2c92cd1c" containerID="b3c30e61579ae1022f6ee25b57cb8f99b9718cd9d3c9a40ea1d6e1e50a0b88ac" exitCode=0 Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.717119 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kfvb" event={"ID":"6e1fc109-ed5c-4277-8800-a02b2c92cd1c","Type":"ContainerDied","Data":"b3c30e61579ae1022f6ee25b57cb8f99b9718cd9d3c9a40ea1d6e1e50a0b88ac"} Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.717176 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kfvb" event={"ID":"6e1fc109-ed5c-4277-8800-a02b2c92cd1c","Type":"ContainerStarted","Data":"743f1eff92a292d61f742a16866c0b24af2dcb52619ddf166a13ecf280c10a25"} Dec 05 16:01:55 crc kubenswrapper[4778]: I1205 16:01:55.749501 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-whqgf"] Dec 05 16:01:56 crc kubenswrapper[4778]: I1205 16:01:56.725312 4778 generic.go:334] "Generic (PLEG): container finished" podID="fe3bc4f1-bfa2-4b56-a478-8718c3c15bef" containerID="34593c81ad3e7ed80bb2ffe0e4bfeadf0aa95fde706be2fa026ad15f75797719" exitCode=0 Dec 05 16:01:56 crc kubenswrapper[4778]: I1205 16:01:56.725352 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whqgf" event={"ID":"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef","Type":"ContainerDied","Data":"34593c81ad3e7ed80bb2ffe0e4bfeadf0aa95fde706be2fa026ad15f75797719"} Dec 05 16:01:56 crc kubenswrapper[4778]: I1205 16:01:56.725689 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whqgf" event={"ID":"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef","Type":"ContainerStarted","Data":"dcdb7ce2972d84f60a7b7170cc835be51b76476cafeb2f3a7bd808f7cc4b1aff"} Dec 05 16:01:56 crc kubenswrapper[4778]: I1205 16:01:56.731998 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kfvb" event={"ID":"6e1fc109-ed5c-4277-8800-a02b2c92cd1c","Type":"ContainerStarted","Data":"25b1184fa18393deee670f51f1b16b490b7b105e16bb9f9249964b7333e09890"} Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.248464 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pqhjb"] Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.250784 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.256351 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.262942 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pqhjb"] Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.306143 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddvsh\" (UniqueName: \"kubernetes.io/projected/8101c50d-ea04-4fc7-a438-951874cc0351-kube-api-access-ddvsh\") pod \"community-operators-pqhjb\" (UID: \"8101c50d-ea04-4fc7-a438-951874cc0351\") " pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.306239 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8101c50d-ea04-4fc7-a438-951874cc0351-catalog-content\") pod \"community-operators-pqhjb\" (UID: \"8101c50d-ea04-4fc7-a438-951874cc0351\") " pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.306297 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8101c50d-ea04-4fc7-a438-951874cc0351-utilities\") pod \"community-operators-pqhjb\" (UID: \"8101c50d-ea04-4fc7-a438-951874cc0351\") " pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.407588 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8101c50d-ea04-4fc7-a438-951874cc0351-catalog-content\") pod \"community-operators-pqhjb\" (UID: \"8101c50d-ea04-4fc7-a438-951874cc0351\") " pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.407655 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8101c50d-ea04-4fc7-a438-951874cc0351-utilities\") pod \"community-operators-pqhjb\" (UID: \"8101c50d-ea04-4fc7-a438-951874cc0351\") " pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.407725 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddvsh\" (UniqueName: \"kubernetes.io/projected/8101c50d-ea04-4fc7-a438-951874cc0351-kube-api-access-ddvsh\") pod \"community-operators-pqhjb\" (UID: \"8101c50d-ea04-4fc7-a438-951874cc0351\") " pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.408210 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8101c50d-ea04-4fc7-a438-951874cc0351-catalog-content\") pod \"community-operators-pqhjb\" (UID: \"8101c50d-ea04-4fc7-a438-951874cc0351\") " pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.408266 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8101c50d-ea04-4fc7-a438-951874cc0351-utilities\") pod \"community-operators-pqhjb\" (UID: 
\"8101c50d-ea04-4fc7-a438-951874cc0351\") " pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.438105 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddvsh\" (UniqueName: \"kubernetes.io/projected/8101c50d-ea04-4fc7-a438-951874cc0351-kube-api-access-ddvsh\") pod \"community-operators-pqhjb\" (UID: \"8101c50d-ea04-4fc7-a438-951874cc0351\") " pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.445967 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v5hfn"] Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.447313 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.449746 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v5hfn"] Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.449760 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.509307 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8447\" (UniqueName: \"kubernetes.io/projected/193ca5ff-dd38-4ec6-aff7-3a43a42a12d9-kube-api-access-f8447\") pod \"certified-operators-v5hfn\" (UID: \"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9\") " pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.509615 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/193ca5ff-dd38-4ec6-aff7-3a43a42a12d9-catalog-content\") pod \"certified-operators-v5hfn\" (UID: \"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9\") " pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.509787 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/193ca5ff-dd38-4ec6-aff7-3a43a42a12d9-utilities\") pod \"certified-operators-v5hfn\" (UID: \"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9\") " pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.568807 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.617072 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/193ca5ff-dd38-4ec6-aff7-3a43a42a12d9-utilities\") pod \"certified-operators-v5hfn\" (UID: \"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9\") " pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.617694 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8447\" (UniqueName: \"kubernetes.io/projected/193ca5ff-dd38-4ec6-aff7-3a43a42a12d9-kube-api-access-f8447\") pod \"certified-operators-v5hfn\" (UID: \"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9\") " pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.617810 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/193ca5ff-dd38-4ec6-aff7-3a43a42a12d9-catalog-content\") pod \"certified-operators-v5hfn\" (UID: \"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9\") " pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.617886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/193ca5ff-dd38-4ec6-aff7-3a43a42a12d9-utilities\") pod \"certified-operators-v5hfn\" (UID: \"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9\") " pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.618481 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/193ca5ff-dd38-4ec6-aff7-3a43a42a12d9-catalog-content\") pod \"certified-operators-v5hfn\" (UID: \"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9\") " pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.641410 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8447\" (UniqueName: \"kubernetes.io/projected/193ca5ff-dd38-4ec6-aff7-3a43a42a12d9-kube-api-access-f8447\") pod \"certified-operators-v5hfn\" (UID: \"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9\") " pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.743175 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whqgf" event={"ID":"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef","Type":"ContainerStarted","Data":"37adbc5d571576d3eff7f4feaeac2d0a0f3b00ce1e3793a7e41dc0751643be95"} Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.750664 4778 generic.go:334] "Generic (PLEG): container finished" podID="6e1fc109-ed5c-4277-8800-a02b2c92cd1c" containerID="25b1184fa18393deee670f51f1b16b490b7b105e16bb9f9249964b7333e09890" exitCode=0 Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.750717 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kfvb" event={"ID":"6e1fc109-ed5c-4277-8800-a02b2c92cd1c","Type":"ContainerDied","Data":"25b1184fa18393deee670f51f1b16b490b7b105e16bb9f9249964b7333e09890"} Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.773553 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:01:57 crc kubenswrapper[4778]: I1205 16:01:57.960066 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pqhjb"] Dec 05 16:01:58 crc kubenswrapper[4778]: I1205 16:01:58.188060 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v5hfn"] Dec 05 16:01:58 crc kubenswrapper[4778]: W1205 16:01:58.227405 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod193ca5ff_dd38_4ec6_aff7_3a43a42a12d9.slice/crio-c995c67693383a61250071354e6f260476ca8c2024b5675fbc5e72984237acd4 WatchSource:0}: Error finding container c995c67693383a61250071354e6f260476ca8c2024b5675fbc5e72984237acd4: Status 404 returned error can't find the container with id c995c67693383a61250071354e6f260476ca8c2024b5675fbc5e72984237acd4 Dec 05 16:01:58 crc kubenswrapper[4778]: E1205 16:01:58.234321 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8101c50d_ea04_4fc7_a438_951874cc0351.slice/crio-b46450ce8235650dda4686c01f0d7214efc89adfcb423112349da67f38fd3aa7.scope\": RecentStats: unable to find data in memory cache]" Dec 05 16:01:58 crc kubenswrapper[4778]: I1205 16:01:58.759479 4778 generic.go:334] "Generic (PLEG): container finished" podID="8101c50d-ea04-4fc7-a438-951874cc0351" containerID="b46450ce8235650dda4686c01f0d7214efc89adfcb423112349da67f38fd3aa7" exitCode=0 Dec 05 16:01:58 crc kubenswrapper[4778]: I1205 16:01:58.759807 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqhjb" event={"ID":"8101c50d-ea04-4fc7-a438-951874cc0351","Type":"ContainerDied","Data":"b46450ce8235650dda4686c01f0d7214efc89adfcb423112349da67f38fd3aa7"} Dec 05 16:01:58 crc kubenswrapper[4778]: I1205 16:01:58.760102 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqhjb" event={"ID":"8101c50d-ea04-4fc7-a438-951874cc0351","Type":"ContainerStarted","Data":"c3c26d3e570d606a3e22e2b7f99d693aaf8607bc3d1c7d888477607e9c653e1c"} Dec 05 16:01:58 crc kubenswrapper[4778]: I1205 16:01:58.762638 4778 generic.go:334] "Generic (PLEG): container finished" podID="fe3bc4f1-bfa2-4b56-a478-8718c3c15bef" containerID="37adbc5d571576d3eff7f4feaeac2d0a0f3b00ce1e3793a7e41dc0751643be95" exitCode=0 Dec 05 16:01:58 crc kubenswrapper[4778]: I1205 16:01:58.762711 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whqgf" event={"ID":"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef","Type":"ContainerDied","Data":"37adbc5d571576d3eff7f4feaeac2d0a0f3b00ce1e3793a7e41dc0751643be95"} Dec 05 16:01:58 crc kubenswrapper[4778]: I1205 16:01:58.764857 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kfvb" event={"ID":"6e1fc109-ed5c-4277-8800-a02b2c92cd1c","Type":"ContainerStarted","Data":"40b26dc1bab838506b6279641ce5d02d04fbaa00794c46eb6900c57ed2fc6fdb"} Dec 05 16:01:58 crc kubenswrapper[4778]: I1205 16:01:58.772246 4778 generic.go:334] "Generic (PLEG): container finished" podID="193ca5ff-dd38-4ec6-aff7-3a43a42a12d9" containerID="060b0cf44bc84923caf054654f3af67c53e91f7d57ca19ca823262aaf8f3f6a4" exitCode=0 Dec 05 16:01:58 crc kubenswrapper[4778]: I1205 16:01:58.772299 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-v5hfn" event={"ID":"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9","Type":"ContainerDied","Data":"060b0cf44bc84923caf054654f3af67c53e91f7d57ca19ca823262aaf8f3f6a4"} Dec 05 16:01:58 crc kubenswrapper[4778]: I1205 16:01:58.772327 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5hfn" event={"ID":"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9","Type":"ContainerStarted","Data":"c995c67693383a61250071354e6f260476ca8c2024b5675fbc5e72984237acd4"} Dec 05 16:01:58 crc kubenswrapper[4778]: I1205 16:01:58.821517 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8kfvb" podStartSLOduration=2.3451172639999998 podStartE2EDuration="4.821496631s" podCreationTimestamp="2025-12-05 16:01:54 +0000 UTC" firstStartedPulling="2025-12-05 16:01:55.718136636 +0000 UTC m=+402.821933016" lastFinishedPulling="2025-12-05 16:01:58.194516003 +0000 UTC m=+405.298312383" observedRunningTime="2025-12-05 16:01:58.819654729 +0000 UTC m=+405.923451119" watchObservedRunningTime="2025-12-05 16:01:58.821496631 +0000 UTC m=+405.925293011" Dec 05 16:01:59 crc kubenswrapper[4778]: I1205 16:01:59.778169 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5hfn" event={"ID":"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9","Type":"ContainerStarted","Data":"8aaddd10ac598eff4169717cfe8907c4ee8b626449f64550e9115cc8a556ae2d"} Dec 05 16:01:59 crc kubenswrapper[4778]: I1205 16:01:59.787582 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whqgf" event={"ID":"fe3bc4f1-bfa2-4b56-a478-8718c3c15bef","Type":"ContainerStarted","Data":"77501bca777b6e9d33936a5da7a4551d91a7a435da860bed083c5e716a752832"} Dec 05 16:01:59 crc kubenswrapper[4778]: I1205 16:01:59.826733 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-whqgf" podStartSLOduration=2.295039616 podStartE2EDuration="4.826707493s" podCreationTimestamp="2025-12-05 16:01:55 +0000 UTC" firstStartedPulling="2025-12-05 16:01:56.72698705 +0000 UTC m=+403.830783430" lastFinishedPulling="2025-12-05 16:01:59.258654917 +0000 UTC m=+406.362451307" observedRunningTime="2025-12-05 16:01:59.822168675 +0000 UTC m=+406.925965055" watchObservedRunningTime="2025-12-05 16:01:59.826707493 +0000 UTC m=+406.930503873" Dec 05 16:02:00 crc kubenswrapper[4778]: I1205 16:02:00.087930 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" podUID="7d0c5c29-5367-41e6-be46-e23a9ac5e281" containerName="registry" containerID="cri-o://01ce86f1a69afcbf4f879cf61543dd7e1ed6b1664ee2e3a6a6462e786b47d2fd" gracePeriod=30 Dec 05 16:02:00 crc kubenswrapper[4778]: I1205 16:02:00.793570 4778 generic.go:334] "Generic (PLEG): container finished" podID="8101c50d-ea04-4fc7-a438-951874cc0351" containerID="06d23c0c0e3f425836f0f2d2a3dbbf2c4a4c15287383a77f261307578f1fbec3" exitCode=0 Dec 05 16:02:00 crc kubenswrapper[4778]: I1205 16:02:00.793681 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqhjb" event={"ID":"8101c50d-ea04-4fc7-a438-951874cc0351","Type":"ContainerDied","Data":"06d23c0c0e3f425836f0f2d2a3dbbf2c4a4c15287383a77f261307578f1fbec3"} Dec 05 16:02:00 crc kubenswrapper[4778]: I1205 16:02:00.796460 4778 generic.go:334] "Generic (PLEG): container finished" podID="7d0c5c29-5367-41e6-be46-e23a9ac5e281" 
containerID="01ce86f1a69afcbf4f879cf61543dd7e1ed6b1664ee2e3a6a6462e786b47d2fd" exitCode=0 Dec 05 16:02:00 crc kubenswrapper[4778]: I1205 16:02:00.796530 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" event={"ID":"7d0c5c29-5367-41e6-be46-e23a9ac5e281","Type":"ContainerDied","Data":"01ce86f1a69afcbf4f879cf61543dd7e1ed6b1664ee2e3a6a6462e786b47d2fd"} Dec 05 16:02:00 crc kubenswrapper[4778]: I1205 16:02:00.796554 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" event={"ID":"7d0c5c29-5367-41e6-be46-e23a9ac5e281","Type":"ContainerDied","Data":"0dc874c0973c58e30d428e8515dbcc5cd901ec22c61825fd87fda006317a1557"} Dec 05 16:02:00 crc kubenswrapper[4778]: I1205 16:02:00.796566 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dc874c0973c58e30d428e8515dbcc5cd901ec22c61825fd87fda006317a1557" Dec 05 16:02:00 crc kubenswrapper[4778]: I1205 16:02:00.799350 4778 generic.go:334] "Generic (PLEG): container finished" podID="193ca5ff-dd38-4ec6-aff7-3a43a42a12d9" containerID="8aaddd10ac598eff4169717cfe8907c4ee8b626449f64550e9115cc8a556ae2d" exitCode=0 Dec 05 16:02:00 crc kubenswrapper[4778]: I1205 16:02:00.799773 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5hfn" event={"ID":"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9","Type":"ContainerDied","Data":"8aaddd10ac598eff4169717cfe8907c4ee8b626449f64550e9115cc8a556ae2d"} Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.703694 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.771658 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d0c5c29-5367-41e6-be46-e23a9ac5e281-ca-trust-extracted\") pod \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.772004 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.772060 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz2lx\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-kube-api-access-dz2lx\") pod \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.772153 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d0c5c29-5367-41e6-be46-e23a9ac5e281-installation-pull-secrets\") pod \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.773011 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-bound-sa-token\") pod \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\" (UID: 
\"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.773057 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d0c5c29-5367-41e6-be46-e23a9ac5e281-registry-certificates\") pod \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.773080 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d0c5c29-5367-41e6-be46-e23a9ac5e281-trusted-ca\") pod \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.773115 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-registry-tls\") pod \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\" (UID: \"7d0c5c29-5367-41e6-be46-e23a9ac5e281\") " Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.773948 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0c5c29-5367-41e6-be46-e23a9ac5e281-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7d0c5c29-5367-41e6-be46-e23a9ac5e281" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.773963 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0c5c29-5367-41e6-be46-e23a9ac5e281-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7d0c5c29-5367-41e6-be46-e23a9ac5e281" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.777969 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7d0c5c29-5367-41e6-be46-e23a9ac5e281" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.779192 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-kube-api-access-dz2lx" (OuterVolumeSpecName: "kube-api-access-dz2lx") pod "7d0c5c29-5367-41e6-be46-e23a9ac5e281" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281"). InnerVolumeSpecName "kube-api-access-dz2lx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.783878 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d0c5c29-5367-41e6-be46-e23a9ac5e281-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7d0c5c29-5367-41e6-be46-e23a9ac5e281" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.787767 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7d0c5c29-5367-41e6-be46-e23a9ac5e281" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.793512 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d0c5c29-5367-41e6-be46-e23a9ac5e281-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7d0c5c29-5367-41e6-be46-e23a9ac5e281" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.794473 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7d0c5c29-5367-41e6-be46-e23a9ac5e281" (UID: "7d0c5c29-5367-41e6-be46-e23a9ac5e281"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.809087 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5hfn" event={"ID":"193ca5ff-dd38-4ec6-aff7-3a43a42a12d9","Type":"ContainerStarted","Data":"c1ca2a598bb28fd8f9f8604d4b3b36b587fd793e3720bf3f7d05a8599e25d6f5"} Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.811662 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqhjb" event={"ID":"8101c50d-ea04-4fc7-a438-951874cc0351","Type":"ContainerStarted","Data":"dcdf4085fd273a657e76a45a75f1eb4d7cdb1854b9b45a6075219a329e379b26"} Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.812035 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m6452" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.837232 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v5hfn" podStartSLOduration=2.187362738 podStartE2EDuration="4.837211698s" podCreationTimestamp="2025-12-05 16:01:57 +0000 UTC" firstStartedPulling="2025-12-05 16:01:58.77402491 +0000 UTC m=+405.877821290" lastFinishedPulling="2025-12-05 16:02:01.42387387 +0000 UTC m=+408.527670250" observedRunningTime="2025-12-05 16:02:01.826504089 +0000 UTC m=+408.930300469" watchObservedRunningTime="2025-12-05 16:02:01.837211698 +0000 UTC m=+408.941008078" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.843809 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pqhjb" podStartSLOduration=2.3652533350000002 podStartE2EDuration="4.843791853s" podCreationTimestamp="2025-12-05 16:01:57 +0000 UTC" firstStartedPulling="2025-12-05 16:01:58.762640431 +0000 UTC m=+405.866436811" lastFinishedPulling="2025-12-05 16:02:01.241178949 +0000 UTC m=+408.344975329" observedRunningTime="2025-12-05 16:02:01.84261855 +0000 UTC m=+408.946414930" watchObservedRunningTime="2025-12-05 16:02:01.843791853 +0000 UTC m=+408.947588233" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.861899 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m6452"] Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.869086 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m6452"] Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.874299 4778 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d0c5c29-5367-41e6-be46-e23a9ac5e281-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.874334 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz2lx\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-kube-api-access-dz2lx\") on node \"crc\" DevicePath \"\"" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.874345 4778 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d0c5c29-5367-41e6-be46-e23a9ac5e281-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.874353 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.874376 4778 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d0c5c29-5367-41e6-be46-e23a9ac5e281-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.874388 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d0c5c29-5367-41e6-be46-e23a9ac5e281-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:02:01 crc kubenswrapper[4778]: I1205 16:02:01.874398 4778 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/7d0c5c29-5367-41e6-be46-e23a9ac5e281-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:02:03 crc kubenswrapper[4778]: I1205 16:02:03.258569 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d0c5c29-5367-41e6-be46-e23a9ac5e281" path="/var/lib/kubelet/pods/7d0c5c29-5367-41e6-be46-e23a9ac5e281/volumes" Dec 05 16:02:05 crc kubenswrapper[4778]: I1205 16:02:05.173433 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:02:05 crc kubenswrapper[4778]: I1205 16:02:05.173782 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:02:05 crc kubenswrapper[4778]: I1205 16:02:05.222397 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:02:05 crc kubenswrapper[4778]: I1205 16:02:05.364964 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:02:05 crc kubenswrapper[4778]: I1205 16:02:05.365080 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:02:05 crc kubenswrapper[4778]: I1205 16:02:05.408961 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:02:05 crc kubenswrapper[4778]: I1205 16:02:05.933339 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8kfvb" Dec 05 16:02:05 crc kubenswrapper[4778]: I1205 16:02:05.938645 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-whqgf" Dec 05 16:02:07 crc kubenswrapper[4778]: I1205 16:02:07.570245 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:02:07 crc kubenswrapper[4778]: I1205 16:02:07.570399 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:02:07 crc kubenswrapper[4778]: I1205 16:02:07.624921 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:02:07 crc kubenswrapper[4778]: I1205 16:02:07.774147 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:02:07 crc kubenswrapper[4778]: I1205 16:02:07.774221 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:02:07 crc kubenswrapper[4778]: I1205 16:02:07.813627 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:02:07 crc kubenswrapper[4778]: I1205 16:02:07.946489 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pqhjb" Dec 05 16:02:07 crc kubenswrapper[4778]: I1205 16:02:07.973634 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v5hfn" Dec 05 16:03:33 crc kubenswrapper[4778]: I1205 16:03:33.414660 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:03:33 crc kubenswrapper[4778]: I1205 16:03:33.415333 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:04:03 crc kubenswrapper[4778]: I1205 16:04:03.415225 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:04:03 crc kubenswrapper[4778]: I1205 16:04:03.416578 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:04:13 crc kubenswrapper[4778]: I1205 16:04:13.505791 4778 scope.go:117] "RemoveContainer" containerID="01ce86f1a69afcbf4f879cf61543dd7e1ed6b1664ee2e3a6a6462e786b47d2fd" Dec 05 16:04:33 crc kubenswrapper[4778]: I1205 16:04:33.414538 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:04:33 crc kubenswrapper[4778]: I1205 16:04:33.416904 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:04:33 crc kubenswrapper[4778]: I1205 16:04:33.417120 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 16:04:33 crc kubenswrapper[4778]: I1205 16:04:33.418129 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf4aa3e6482ce3f53fbdeb198457c65bd5fd850856e011867fd9a803a7e3ab35"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:04:33 crc kubenswrapper[4778]: I1205 16:04:33.418423 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://bf4aa3e6482ce3f53fbdeb198457c65bd5fd850856e011867fd9a803a7e3ab35" gracePeriod=600 Dec 05 16:04:33 crc kubenswrapper[4778]: I1205 16:04:33.809666 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="bf4aa3e6482ce3f53fbdeb198457c65bd5fd850856e011867fd9a803a7e3ab35" exitCode=0 Dec 05 
16:04:33 crc kubenswrapper[4778]: I1205 16:04:33.809722 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"bf4aa3e6482ce3f53fbdeb198457c65bd5fd850856e011867fd9a803a7e3ab35"} Dec 05 16:04:33 crc kubenswrapper[4778]: I1205 16:04:33.810432 4778 scope.go:117] "RemoveContainer" containerID="3d94c7d5a3c87642bb8001766b12e02cd4c800446b73b1de7ad07648c1824c6e" Dec 05 16:04:35 crc kubenswrapper[4778]: I1205 16:04:35.823281 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"2f0eb734c5f784238f5df16ab2b9c81ee74d1805c2a6835ca18eb607dfb3dd7b"} Dec 05 16:07:03 crc kubenswrapper[4778]: I1205 16:07:03.415036 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:07:03 crc kubenswrapper[4778]: I1205 16:07:03.415844 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:07:33 crc kubenswrapper[4778]: I1205 16:07:33.414950 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:07:33 crc kubenswrapper[4778]: I1205 16:07:33.415740 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:07:52 crc kubenswrapper[4778]: I1205 16:07:52.544593 4778 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 16:07:57 crc kubenswrapper[4778]: I1205 16:07:57.835306 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c2l79"] Dec 05 16:07:57 crc kubenswrapper[4778]: E1205 16:07:57.837716 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0c5c29-5367-41e6-be46-e23a9ac5e281" containerName="registry" Dec 05 16:07:57 crc kubenswrapper[4778]: I1205 16:07:57.837737 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0c5c29-5367-41e6-be46-e23a9ac5e281" containerName="registry" Dec 05 16:07:57 crc kubenswrapper[4778]: I1205 16:07:57.837909 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d0c5c29-5367-41e6-be46-e23a9ac5e281" containerName="registry" Dec 05 16:07:57 crc kubenswrapper[4778]: I1205 16:07:57.839019 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:07:57 crc kubenswrapper[4778]: I1205 16:07:57.847675 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2l79"] Dec 05 16:07:57 crc kubenswrapper[4778]: I1205 16:07:57.952532 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phnjx\" (UniqueName: \"kubernetes.io/projected/132faad2-67d4-46f7-81d2-24df949bb199-kube-api-access-phnjx\") pod \"certified-operators-c2l79\" (UID: \"132faad2-67d4-46f7-81d2-24df949bb199\") " pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:07:57 crc kubenswrapper[4778]: I1205 16:07:57.952593 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132faad2-67d4-46f7-81d2-24df949bb199-catalog-content\") pod \"certified-operators-c2l79\" (UID: \"132faad2-67d4-46f7-81d2-24df949bb199\") " pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:07:57 crc kubenswrapper[4778]: I1205 16:07:57.952625 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132faad2-67d4-46f7-81d2-24df949bb199-utilities\") pod \"certified-operators-c2l79\" (UID: \"132faad2-67d4-46f7-81d2-24df949bb199\") " pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:07:58 crc kubenswrapper[4778]: I1205 16:07:58.054160 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phnjx\" (UniqueName: \"kubernetes.io/projected/132faad2-67d4-46f7-81d2-24df949bb199-kube-api-access-phnjx\") pod \"certified-operators-c2l79\" (UID: \"132faad2-67d4-46f7-81d2-24df949bb199\") " pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:07:58 crc kubenswrapper[4778]: I1205 16:07:58.054205 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132faad2-67d4-46f7-81d2-24df949bb199-catalog-content\") pod \"certified-operators-c2l79\" (UID: \"132faad2-67d4-46f7-81d2-24df949bb199\") " pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:07:58 crc kubenswrapper[4778]: I1205 16:07:58.054226 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132faad2-67d4-46f7-81d2-24df949bb199-utilities\") pod \"certified-operators-c2l79\" (UID: \"132faad2-67d4-46f7-81d2-24df949bb199\") " pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:07:58 crc kubenswrapper[4778]: I1205 16:07:58.054739 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132faad2-67d4-46f7-81d2-24df949bb199-utilities\") pod \"certified-operators-c2l79\" (UID: \"132faad2-67d4-46f7-81d2-24df949bb199\") " pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:07:58 crc kubenswrapper[4778]: I1205 16:07:58.054798 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132faad2-67d4-46f7-81d2-24df949bb199-catalog-content\") pod \"certified-operators-c2l79\" (UID: \"132faad2-67d4-46f7-81d2-24df949bb199\") " pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:07:58 crc kubenswrapper[4778]: I1205 16:07:58.071393 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-phnjx\" (UniqueName: \"kubernetes.io/projected/132faad2-67d4-46f7-81d2-24df949bb199-kube-api-access-phnjx\") pod \"certified-operators-c2l79\" (UID: \"132faad2-67d4-46f7-81d2-24df949bb199\") " pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:07:58 crc kubenswrapper[4778]: I1205 16:07:58.166268 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:07:58 crc kubenswrapper[4778]: W1205 16:07:58.427967 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod132faad2_67d4_46f7_81d2_24df949bb199.slice/crio-fbde058e5a0f7e11dc70e5a13ab5216124b97155a5a45eab36f0339b29d2eadc WatchSource:0}: Error finding container fbde058e5a0f7e11dc70e5a13ab5216124b97155a5a45eab36f0339b29d2eadc: Status 404 returned error can't find the container with id fbde058e5a0f7e11dc70e5a13ab5216124b97155a5a45eab36f0339b29d2eadc Dec 05 16:07:58 crc kubenswrapper[4778]: I1205 16:07:58.432430 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2l79"] Dec 05 16:07:59 crc kubenswrapper[4778]: I1205 16:07:59.174228 4778 generic.go:334] "Generic (PLEG): container finished" podID="132faad2-67d4-46f7-81d2-24df949bb199" containerID="96efac5a86b0fb12592da93b20ca9621342c2212f3422ff116a33e6f2a8de890" exitCode=0 Dec 05 16:07:59 crc kubenswrapper[4778]: I1205 16:07:59.174287 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2l79" event={"ID":"132faad2-67d4-46f7-81d2-24df949bb199","Type":"ContainerDied","Data":"96efac5a86b0fb12592da93b20ca9621342c2212f3422ff116a33e6f2a8de890"} Dec 05 16:07:59 crc kubenswrapper[4778]: I1205 16:07:59.174683 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2l79" event={"ID":"132faad2-67d4-46f7-81d2-24df949bb199","Type":"ContainerStarted","Data":"fbde058e5a0f7e11dc70e5a13ab5216124b97155a5a45eab36f0339b29d2eadc"} Dec 05 16:07:59 crc kubenswrapper[4778]: I1205 16:07:59.178660 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 16:08:00 crc kubenswrapper[4778]: I1205 16:08:00.183126 4778 generic.go:334] "Generic (PLEG): container finished" podID="132faad2-67d4-46f7-81d2-24df949bb199" containerID="efa623c2c71b9e0d25d976e0facf8e2c8c8772d5a189893d137d3d5ef7b39e62" exitCode=0 Dec 05 16:08:00 crc kubenswrapper[4778]: I1205 16:08:00.183177 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2l79" event={"ID":"132faad2-67d4-46f7-81d2-24df949bb199","Type":"ContainerDied","Data":"efa623c2c71b9e0d25d976e0facf8e2c8c8772d5a189893d137d3d5ef7b39e62"} Dec 05 16:08:01 crc kubenswrapper[4778]: I1205 16:08:01.195467 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2l79" event={"ID":"132faad2-67d4-46f7-81d2-24df949bb199","Type":"ContainerStarted","Data":"475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8"} Dec 05 16:08:01 crc kubenswrapper[4778]: I1205 16:08:01.233905 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c2l79" podStartSLOduration=2.568341933 podStartE2EDuration="4.233880508s" podCreationTimestamp="2025-12-05 16:07:57 +0000 UTC" firstStartedPulling="2025-12-05 16:07:59.177942934 +0000 UTC 
m=+766.281739354" lastFinishedPulling="2025-12-05 16:08:00.843481509 +0000 UTC m=+767.947277929" observedRunningTime="2025-12-05 16:08:01.227979459 +0000 UTC m=+768.331775879" watchObservedRunningTime="2025-12-05 16:08:01.233880508 +0000 UTC m=+768.337676928" Dec 05 16:08:03 crc kubenswrapper[4778]: I1205 16:08:03.415045 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:08:03 crc kubenswrapper[4778]: I1205 16:08:03.416152 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:08:03 crc kubenswrapper[4778]: I1205 16:08:03.416350 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 16:08:03 crc kubenswrapper[4778]: I1205 16:08:03.417462 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2f0eb734c5f784238f5df16ab2b9c81ee74d1805c2a6835ca18eb607dfb3dd7b"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:08:03 crc kubenswrapper[4778]: I1205 16:08:03.417737 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://2f0eb734c5f784238f5df16ab2b9c81ee74d1805c2a6835ca18eb607dfb3dd7b" gracePeriod=600 Dec 05 16:08:04 crc kubenswrapper[4778]: I1205 16:08:04.215012 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="2f0eb734c5f784238f5df16ab2b9c81ee74d1805c2a6835ca18eb607dfb3dd7b" exitCode=0 Dec 05 16:08:04 crc kubenswrapper[4778]: I1205 16:08:04.215047 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"2f0eb734c5f784238f5df16ab2b9c81ee74d1805c2a6835ca18eb607dfb3dd7b"} Dec 05 16:08:04 crc kubenswrapper[4778]: I1205 16:08:04.215516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"d05f43ec797c17341e7f030a46399a4fd0a9ce3922c28bd4bd201675fc830e2a"} Dec 05 16:08:04 crc kubenswrapper[4778]: I1205 16:08:04.215545 4778 scope.go:117] "RemoveContainer" containerID="bf4aa3e6482ce3f53fbdeb198457c65bd5fd850856e011867fd9a803a7e3ab35" Dec 05 16:08:08 crc kubenswrapper[4778]: I1205 16:08:08.167547 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:08:08 crc kubenswrapper[4778]: I1205 16:08:08.168102 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:08:08 crc 
kubenswrapper[4778]: I1205 16:08:08.241466 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:08:08 crc kubenswrapper[4778]: I1205 16:08:08.321667 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:08:08 crc kubenswrapper[4778]: I1205 16:08:08.503917 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2l79"] Dec 05 16:08:10 crc kubenswrapper[4778]: I1205 16:08:10.258345 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c2l79" podUID="132faad2-67d4-46f7-81d2-24df949bb199" containerName="registry-server" containerID="cri-o://475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8" gracePeriod=2 Dec 05 16:08:10 crc kubenswrapper[4778]: I1205 16:08:10.699099 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:08:10 crc kubenswrapper[4778]: I1205 16:08:10.827331 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phnjx\" (UniqueName: \"kubernetes.io/projected/132faad2-67d4-46f7-81d2-24df949bb199-kube-api-access-phnjx\") pod \"132faad2-67d4-46f7-81d2-24df949bb199\" (UID: \"132faad2-67d4-46f7-81d2-24df949bb199\") " Dec 05 16:08:10 crc kubenswrapper[4778]: I1205 16:08:10.827474 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132faad2-67d4-46f7-81d2-24df949bb199-catalog-content\") pod \"132faad2-67d4-46f7-81d2-24df949bb199\" (UID: \"132faad2-67d4-46f7-81d2-24df949bb199\") " Dec 05 16:08:10 crc kubenswrapper[4778]: I1205 16:08:10.827676 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132faad2-67d4-46f7-81d2-24df949bb199-utilities\") pod \"132faad2-67d4-46f7-81d2-24df949bb199\" (UID: \"132faad2-67d4-46f7-81d2-24df949bb199\") " Dec 05 16:08:10 crc kubenswrapper[4778]: I1205 16:08:10.829580 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132faad2-67d4-46f7-81d2-24df949bb199-utilities" (OuterVolumeSpecName: "utilities") pod "132faad2-67d4-46f7-81d2-24df949bb199" (UID: "132faad2-67d4-46f7-81d2-24df949bb199"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:08:10 crc kubenswrapper[4778]: I1205 16:08:10.837637 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/132faad2-67d4-46f7-81d2-24df949bb199-kube-api-access-phnjx" (OuterVolumeSpecName: "kube-api-access-phnjx") pod "132faad2-67d4-46f7-81d2-24df949bb199" (UID: "132faad2-67d4-46f7-81d2-24df949bb199"). InnerVolumeSpecName "kube-api-access-phnjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:08:10 crc kubenswrapper[4778]: I1205 16:08:10.914124 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132faad2-67d4-46f7-81d2-24df949bb199-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "132faad2-67d4-46f7-81d2-24df949bb199" (UID: "132faad2-67d4-46f7-81d2-24df949bb199"). InnerVolumeSpecName "catalog-content". 
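
The pod_startup_latency_tracker entry logged at 16:08:01 above for certified-operators-c2l79 is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick recomputation from the logged timestamps, as a self-contained check:

// latency_check.go - recomputes both durations from the timestamps logged
// at 16:08:01 for certified-operators-c2l79.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-12-05 16:07:57 +0000 UTC")
	firstPull := mustParse("2025-12-05 16:07:59.177942934 +0000 UTC")
	lastPull := mustParse("2025-12-05 16:08:00.843481509 +0000 UTC")
	running := mustParse("2025-12-05 16:08:01.233880508 +0000 UTC")

	e2e := running.Sub(created)          // 4.233880508s, matches podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 2.568341933s, matches podStartSLOduration
	fmt.Println(e2e, slo)
}

In other words, the SLO figure deliberately excludes the ~1.67s spent pulling images, which is outside the kubelet's control.
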
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:08:10 crc kubenswrapper[4778]: I1205 16:08:10.929482 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phnjx\" (UniqueName: \"kubernetes.io/projected/132faad2-67d4-46f7-81d2-24df949bb199-kube-api-access-phnjx\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:10 crc kubenswrapper[4778]: I1205 16:08:10.929558 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132faad2-67d4-46f7-81d2-24df949bb199-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:10 crc kubenswrapper[4778]: I1205 16:08:10.929578 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132faad2-67d4-46f7-81d2-24df949bb199-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.264651 4778 generic.go:334] "Generic (PLEG): container finished" podID="132faad2-67d4-46f7-81d2-24df949bb199" containerID="475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8" exitCode=0 Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.264690 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2l79" event={"ID":"132faad2-67d4-46f7-81d2-24df949bb199","Type":"ContainerDied","Data":"475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8"} Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.264720 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2l79" event={"ID":"132faad2-67d4-46f7-81d2-24df949bb199","Type":"ContainerDied","Data":"fbde058e5a0f7e11dc70e5a13ab5216124b97155a5a45eab36f0339b29d2eadc"} Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.264741 4778 scope.go:117] "RemoveContainer" containerID="475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8" Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.264743 4778 util.go:48] "No ready sandbox for pod can be found. 
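
The generic.go:334 "container finished" lines paired with kubelet.go:2453 "SyncLoop (PLEG)" events above are the generic PLEG (pod lifecycle event generator) at work: it periodically relists containers from CRI-O, diffs the result against its cached view, and turns state transitions into ContainerStarted/ContainerDied events for the sync loop. A stripped-down sketch of that diff, with illustrative types and a truncated container ID, not the kubelet's own:

// pleg_sketch.go - how a relist turns container-state diffs into the
// ContainerStarted/ContainerDied events seen in this log.
package main

import "fmt"

type state string

const (
	running state = "running"
	exited  state = "exited"
)

type event struct {
	Type string // "ContainerStarted" or "ContainerDied"
	Data string // container ID
}

// relist compares the previous and current views of one pod's containers.
func relist(old, cur map[string]state) []event {
	var events []event
	for id, s := range cur {
		prev, seen := old[id]
		if s == running && (!seen || prev != running) {
			events = append(events, event{"ContainerStarted", id})
		}
		if s == exited && (!seen || prev != exited) {
			events = append(events, event{"ContainerDied", id})
		}
	}
	return events
}

func main() {
	old := map[string]state{"475df2cc72dc": running}
	cur := map[string]state{"475df2cc72dc": exited}
	fmt.Println(relist(old, cur)) // [{ContainerDied 475df2cc72dc}]
}

Note the sandbox itself (fbde058e...) also gets a ContainerDied event above, which is why the pod teardown shows two Died events for one registry-server container.
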
Need to start a new one" pod="openshift-marketplace/certified-operators-c2l79" Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.286274 4778 scope.go:117] "RemoveContainer" containerID="efa623c2c71b9e0d25d976e0facf8e2c8c8772d5a189893d137d3d5ef7b39e62" Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.311683 4778 scope.go:117] "RemoveContainer" containerID="96efac5a86b0fb12592da93b20ca9621342c2212f3422ff116a33e6f2a8de890" Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.313593 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2l79"] Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.321566 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c2l79"] Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.332130 4778 scope.go:117] "RemoveContainer" containerID="475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8" Dec 05 16:08:11 crc kubenswrapper[4778]: E1205 16:08:11.332865 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8\": container with ID starting with 475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8 not found: ID does not exist" containerID="475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8" Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.332911 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8"} err="failed to get container status \"475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8\": rpc error: code = NotFound desc = could not find container \"475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8\": container with ID starting with 475df2cc72dc4512d44c59e602c6e5f0672d031447dea3e32456daea8b5f2ea8 not found: ID does not exist" Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.332942 4778 scope.go:117] "RemoveContainer" containerID="efa623c2c71b9e0d25d976e0facf8e2c8c8772d5a189893d137d3d5ef7b39e62" Dec 05 16:08:11 crc kubenswrapper[4778]: E1205 16:08:11.333314 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efa623c2c71b9e0d25d976e0facf8e2c8c8772d5a189893d137d3d5ef7b39e62\": container with ID starting with efa623c2c71b9e0d25d976e0facf8e2c8c8772d5a189893d137d3d5ef7b39e62 not found: ID does not exist" containerID="efa623c2c71b9e0d25d976e0facf8e2c8c8772d5a189893d137d3d5ef7b39e62" Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.333523 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa623c2c71b9e0d25d976e0facf8e2c8c8772d5a189893d137d3d5ef7b39e62"} err="failed to get container status \"efa623c2c71b9e0d25d976e0facf8e2c8c8772d5a189893d137d3d5ef7b39e62\": rpc error: code = NotFound desc = could not find container \"efa623c2c71b9e0d25d976e0facf8e2c8c8772d5a189893d137d3d5ef7b39e62\": container with ID starting with efa623c2c71b9e0d25d976e0facf8e2c8c8772d5a189893d137d3d5ef7b39e62 not found: ID does not exist" Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.333570 4778 scope.go:117] "RemoveContainer" containerID="96efac5a86b0fb12592da93b20ca9621342c2212f3422ff116a33e6f2a8de890" Dec 05 16:08:11 crc kubenswrapper[4778]: E1205 16:08:11.333976 4778 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"96efac5a86b0fb12592da93b20ca9621342c2212f3422ff116a33e6f2a8de890\": container with ID starting with 96efac5a86b0fb12592da93b20ca9621342c2212f3422ff116a33e6f2a8de890 not found: ID does not exist" containerID="96efac5a86b0fb12592da93b20ca9621342c2212f3422ff116a33e6f2a8de890" Dec 05 16:08:11 crc kubenswrapper[4778]: I1205 16:08:11.334017 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96efac5a86b0fb12592da93b20ca9621342c2212f3422ff116a33e6f2a8de890"} err="failed to get container status \"96efac5a86b0fb12592da93b20ca9621342c2212f3422ff116a33e6f2a8de890\": rpc error: code = NotFound desc = could not find container \"96efac5a86b0fb12592da93b20ca9621342c2212f3422ff116a33e6f2a8de890\": container with ID starting with 96efac5a86b0fb12592da93b20ca9621342c2212f3422ff116a33e6f2a8de890 not found: ID does not exist" Dec 05 16:08:13 crc kubenswrapper[4778]: I1205 16:08:13.262841 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="132faad2-67d4-46f7-81d2-24df949bb199" path="/var/lib/kubelet/pods/132faad2-67d4-46f7-81d2-24df949bb199/volumes" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.434857 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn"] Dec 05 16:08:29 crc kubenswrapper[4778]: E1205 16:08:29.436777 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132faad2-67d4-46f7-81d2-24df949bb199" containerName="extract-utilities" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.436810 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="132faad2-67d4-46f7-81d2-24df949bb199" containerName="extract-utilities" Dec 05 16:08:29 crc kubenswrapper[4778]: E1205 16:08:29.436830 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132faad2-67d4-46f7-81d2-24df949bb199" containerName="registry-server" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.436838 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="132faad2-67d4-46f7-81d2-24df949bb199" containerName="registry-server" Dec 05 16:08:29 crc kubenswrapper[4778]: E1205 16:08:29.436853 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132faad2-67d4-46f7-81d2-24df949bb199" containerName="extract-content" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.436861 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="132faad2-67d4-46f7-81d2-24df949bb199" containerName="extract-content" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.436995 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="132faad2-67d4-46f7-81d2-24df949bb199" containerName="registry-server" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.437849 4778 util.go:30] "No sandbox for pod can be found. 
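
The burst of cpu_manager.go:410 / state_mem.go:107 / memory_manager.go:354 entries above is resource-manager housekeeping performed when the new bundle pod is admitted: assignments still recorded for containers of pods the kubelet no longer tracks (here the deleted certified-operators-c2l79 pod, UID 132faad2...) are dropped from in-memory state. The E-level lines are logged per removal but are routine. A sketch of the bookkeeping follows; the map shapes are an assumption, and the real managers also persist state to a checkpoint file:

// stale_state_sketch.go - drops per-container resource assignments for pods
// the kubelet no longer tracks, mirroring the RemoveStaleState entries above.
package main

import "fmt"

// podUID -> containerName -> assigned CPU set (string form for brevity).
type assignments map[string]map[string]string

func removeStaleState(st assignments, activePods map[string]bool) {
	for podUID, containers := range st {
		if activePods[podUID] {
			continue
		}
		for name := range containers {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", podUID, name)
			delete(containers, name) // "Deleted CPUSet assignment"
		}
		delete(st, podUID)
	}
}

func main() {
	st := assignments{"132faad2-67d4-46f7-81d2-24df949bb199": {"registry-server": "0-3"}}
	removeStaleState(st, map[string]bool{})
	fmt.Println(st)
}
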
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.439947 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.444847 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn"] Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.477351 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7tls\" (UniqueName: \"kubernetes.io/projected/4b63bba1-6d00-4e21-81b5-9c2573000afd-kube-api-access-g7tls\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn\" (UID: \"4b63bba1-6d00-4e21-81b5-9c2573000afd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.477553 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b63bba1-6d00-4e21-81b5-9c2573000afd-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn\" (UID: \"4b63bba1-6d00-4e21-81b5-9c2573000afd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.477689 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b63bba1-6d00-4e21-81b5-9c2573000afd-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn\" (UID: \"4b63bba1-6d00-4e21-81b5-9c2573000afd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.579578 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7tls\" (UniqueName: \"kubernetes.io/projected/4b63bba1-6d00-4e21-81b5-9c2573000afd-kube-api-access-g7tls\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn\" (UID: \"4b63bba1-6d00-4e21-81b5-9c2573000afd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.579688 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b63bba1-6d00-4e21-81b5-9c2573000afd-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn\" (UID: \"4b63bba1-6d00-4e21-81b5-9c2573000afd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.579748 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b63bba1-6d00-4e21-81b5-9c2573000afd-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn\" (UID: \"4b63bba1-6d00-4e21-81b5-9c2573000afd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.580335 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4b63bba1-6d00-4e21-81b5-9c2573000afd-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn\" (UID: \"4b63bba1-6d00-4e21-81b5-9c2573000afd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.580330 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b63bba1-6d00-4e21-81b5-9c2573000afd-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn\" (UID: \"4b63bba1-6d00-4e21-81b5-9c2573000afd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.599644 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7tls\" (UniqueName: \"kubernetes.io/projected/4b63bba1-6d00-4e21-81b5-9c2573000afd-kube-api-access-g7tls\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn\" (UID: \"4b63bba1-6d00-4e21-81b5-9c2573000afd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:29 crc kubenswrapper[4778]: I1205 16:08:29.763659 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:30 crc kubenswrapper[4778]: I1205 16:08:30.171167 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn"] Dec 05 16:08:30 crc kubenswrapper[4778]: I1205 16:08:30.393755 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" event={"ID":"4b63bba1-6d00-4e21-81b5-9c2573000afd","Type":"ContainerStarted","Data":"a17124a8afde967d8d6321e7cae7b8eeb4c235d2b74809ed72e9fef488e9a564"} Dec 05 16:08:30 crc kubenswrapper[4778]: I1205 16:08:30.393801 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" event={"ID":"4b63bba1-6d00-4e21-81b5-9c2573000afd","Type":"ContainerStarted","Data":"2adab42be8bbe407aaa1bf9d7683bdb29727ee0c9359fcebc987a87d39fa25e5"} Dec 05 16:08:31 crc kubenswrapper[4778]: I1205 16:08:31.401562 4778 generic.go:334] "Generic (PLEG): container finished" podID="4b63bba1-6d00-4e21-81b5-9c2573000afd" containerID="a17124a8afde967d8d6321e7cae7b8eeb4c235d2b74809ed72e9fef488e9a564" exitCode=0 Dec 05 16:08:31 crc kubenswrapper[4778]: I1205 16:08:31.401607 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" event={"ID":"4b63bba1-6d00-4e21-81b5-9c2573000afd","Type":"ContainerDied","Data":"a17124a8afde967d8d6321e7cae7b8eeb4c235d2b74809ed72e9fef488e9a564"} Dec 05 16:08:31 crc kubenswrapper[4778]: I1205 16:08:31.799191 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wkbtn"] Dec 05 16:08:31 crc kubenswrapper[4778]: I1205 16:08:31.800627 4778 util.go:30] "No sandbox for pod can be found. 
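
The "bundle" and "util" volumes whose MountVolume.SetUp succeeded above are emptyDir volumes (see their kubernetes.io/empty-dir UniqueNames), so "mounting" them is essentially directory creation under the pod's volumes directory; only the projected kube-api-access volume involves materializing token and CA data, which is why it completes ~20ms later. A rough sketch of the emptyDir case, assuming the conventional kubelet path layout and with the permission bits as an assumption:

// emptydir_setup_sketch.go - what MountVolume.SetUp amounts to for an
// emptyDir volume: create the per-pod directory containers will bind-mount.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func setUpEmptyDir(kubeletRoot, podUID, volName string) (string, error) {
	dir := filepath.Join(kubeletRoot, "pods", podUID, "volumes", "kubernetes.io~empty-dir", volName)
	if err := os.MkdirAll(dir, 0o777); err != nil {
		return "", err
	}
	return dir, nil
}

func main() {
	// Demo against a scratch root rather than /var/lib/kubelet.
	dir, err := setUpEmptyDir(os.TempDir(), "4b63bba1-6d00-4e21-81b5-9c2573000afd", "bundle")
	fmt.Println(dir, err)
}
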
Need to start a new one" pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:31 crc kubenswrapper[4778]: I1205 16:08:31.833359 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkbtn"] Dec 05 16:08:31 crc kubenswrapper[4778]: I1205 16:08:31.905612 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7bnc\" (UniqueName: \"kubernetes.io/projected/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-kube-api-access-d7bnc\") pod \"redhat-operators-wkbtn\" (UID: \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\") " pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:31 crc kubenswrapper[4778]: I1205 16:08:31.905690 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-catalog-content\") pod \"redhat-operators-wkbtn\" (UID: \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\") " pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:31 crc kubenswrapper[4778]: I1205 16:08:31.905835 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-utilities\") pod \"redhat-operators-wkbtn\" (UID: \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\") " pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:32 crc kubenswrapper[4778]: I1205 16:08:32.007179 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7bnc\" (UniqueName: \"kubernetes.io/projected/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-kube-api-access-d7bnc\") pod \"redhat-operators-wkbtn\" (UID: \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\") " pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:32 crc kubenswrapper[4778]: I1205 16:08:32.007221 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-catalog-content\") pod \"redhat-operators-wkbtn\" (UID: \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\") " pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:32 crc kubenswrapper[4778]: I1205 16:08:32.007268 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-utilities\") pod \"redhat-operators-wkbtn\" (UID: \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\") " pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:32 crc kubenswrapper[4778]: I1205 16:08:32.007711 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-utilities\") pod \"redhat-operators-wkbtn\" (UID: \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\") " pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:32 crc kubenswrapper[4778]: I1205 16:08:32.007847 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-catalog-content\") pod \"redhat-operators-wkbtn\" (UID: \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\") " pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:32 crc kubenswrapper[4778]: I1205 16:08:32.030865 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d7bnc\" (UniqueName: \"kubernetes.io/projected/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-kube-api-access-d7bnc\") pod \"redhat-operators-wkbtn\" (UID: \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\") " pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:32 crc kubenswrapper[4778]: I1205 16:08:32.123076 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:32 crc kubenswrapper[4778]: I1205 16:08:32.317442 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkbtn"] Dec 05 16:08:32 crc kubenswrapper[4778]: I1205 16:08:32.408698 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkbtn" event={"ID":"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2","Type":"ContainerStarted","Data":"5604f91b7eacfece64261900466ff7d9452690074630bdaa930285ec55124a07"} Dec 05 16:08:33 crc kubenswrapper[4778]: I1205 16:08:33.417895 4778 generic.go:334] "Generic (PLEG): container finished" podID="4b63bba1-6d00-4e21-81b5-9c2573000afd" containerID="d828f5e4ba0ceb56919ae998ab594e02196b09a34f4d61dcafeb930e2fe2a557" exitCode=0 Dec 05 16:08:33 crc kubenswrapper[4778]: I1205 16:08:33.417958 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" event={"ID":"4b63bba1-6d00-4e21-81b5-9c2573000afd","Type":"ContainerDied","Data":"d828f5e4ba0ceb56919ae998ab594e02196b09a34f4d61dcafeb930e2fe2a557"} Dec 05 16:08:33 crc kubenswrapper[4778]: I1205 16:08:33.421994 4778 generic.go:334] "Generic (PLEG): container finished" podID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" containerID="fb596eac147585056d4914258be871aefa8ee2d72213144ea01d7402b4fdf1b0" exitCode=0 Dec 05 16:08:33 crc kubenswrapper[4778]: I1205 16:08:33.422053 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkbtn" event={"ID":"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2","Type":"ContainerDied","Data":"fb596eac147585056d4914258be871aefa8ee2d72213144ea01d7402b4fdf1b0"} Dec 05 16:08:34 crc kubenswrapper[4778]: I1205 16:08:34.438687 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkbtn" event={"ID":"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2","Type":"ContainerStarted","Data":"200d1621eb3a565fb18bb36f8914db9df08d32772ef4d49b9fab09a21e87e976"} Dec 05 16:08:34 crc kubenswrapper[4778]: I1205 16:08:34.440467 4778 generic.go:334] "Generic (PLEG): container finished" podID="4b63bba1-6d00-4e21-81b5-9c2573000afd" containerID="dbdef390401e00f03af18ea4c45de16974826a37b5f4f127df0679eedab15d85" exitCode=0 Dec 05 16:08:34 crc kubenswrapper[4778]: I1205 16:08:34.440503 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" event={"ID":"4b63bba1-6d00-4e21-81b5-9c2573000afd","Type":"ContainerDied","Data":"dbdef390401e00f03af18ea4c45de16974826a37b5f4f127df0679eedab15d85"} Dec 05 16:08:35 crc kubenswrapper[4778]: I1205 16:08:35.452774 4778 generic.go:334] "Generic (PLEG): container finished" podID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" containerID="200d1621eb3a565fb18bb36f8914db9df08d32772ef4d49b9fab09a21e87e976" exitCode=0 Dec 05 16:08:35 crc kubenswrapper[4778]: I1205 16:08:35.452965 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkbtn" 
event={"ID":"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2","Type":"ContainerDied","Data":"200d1621eb3a565fb18bb36f8914db9df08d32772ef4d49b9fab09a21e87e976"} Dec 05 16:08:35 crc kubenswrapper[4778]: I1205 16:08:35.673440 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:35 crc kubenswrapper[4778]: I1205 16:08:35.752917 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b63bba1-6d00-4e21-81b5-9c2573000afd-util\") pod \"4b63bba1-6d00-4e21-81b5-9c2573000afd\" (UID: \"4b63bba1-6d00-4e21-81b5-9c2573000afd\") " Dec 05 16:08:35 crc kubenswrapper[4778]: I1205 16:08:35.752962 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b63bba1-6d00-4e21-81b5-9c2573000afd-bundle\") pod \"4b63bba1-6d00-4e21-81b5-9c2573000afd\" (UID: \"4b63bba1-6d00-4e21-81b5-9c2573000afd\") " Dec 05 16:08:35 crc kubenswrapper[4778]: I1205 16:08:35.753010 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7tls\" (UniqueName: \"kubernetes.io/projected/4b63bba1-6d00-4e21-81b5-9c2573000afd-kube-api-access-g7tls\") pod \"4b63bba1-6d00-4e21-81b5-9c2573000afd\" (UID: \"4b63bba1-6d00-4e21-81b5-9c2573000afd\") " Dec 05 16:08:35 crc kubenswrapper[4778]: I1205 16:08:35.755561 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b63bba1-6d00-4e21-81b5-9c2573000afd-bundle" (OuterVolumeSpecName: "bundle") pod "4b63bba1-6d00-4e21-81b5-9c2573000afd" (UID: "4b63bba1-6d00-4e21-81b5-9c2573000afd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:08:35 crc kubenswrapper[4778]: I1205 16:08:35.762637 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b63bba1-6d00-4e21-81b5-9c2573000afd-kube-api-access-g7tls" (OuterVolumeSpecName: "kube-api-access-g7tls") pod "4b63bba1-6d00-4e21-81b5-9c2573000afd" (UID: "4b63bba1-6d00-4e21-81b5-9c2573000afd"). InnerVolumeSpecName "kube-api-access-g7tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:08:35 crc kubenswrapper[4778]: I1205 16:08:35.854806 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b63bba1-6d00-4e21-81b5-9c2573000afd-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:35 crc kubenswrapper[4778]: I1205 16:08:35.854843 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7tls\" (UniqueName: \"kubernetes.io/projected/4b63bba1-6d00-4e21-81b5-9c2573000afd-kube-api-access-g7tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:36 crc kubenswrapper[4778]: I1205 16:08:36.159081 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b63bba1-6d00-4e21-81b5-9c2573000afd-util" (OuterVolumeSpecName: "util") pod "4b63bba1-6d00-4e21-81b5-9c2573000afd" (UID: "4b63bba1-6d00-4e21-81b5-9c2573000afd"). InnerVolumeSpecName "util". 
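
Each "Killing container with a grace period" entry above (gracePeriod=600 for the machine-config-daemon at 16:08:03, gracePeriod=2 for the registry-server at 16:08:10) follows the standard termination contract: SIGTERM first, then SIGKILL once the grace period expires. A self-contained sketch of that contract against an ordinary process; the real kill goes through the CRI StopContainer call, not os/exec:

// grace_kill_sketch.go - SIGTERM, then SIGKILL after the grace period,
// approximating "Killing container with a grace period" above.
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	_ = cmd.Process.Signal(syscall.SIGTERM) // polite request to exit
	select {
	case err := <-done:
		fmt.Println("exited within grace period:", err)
	case <-time.After(grace):
		_ = cmd.Process.Kill() // grace expired: SIGKILL
		fmt.Println("grace period expired, killed:", <-done)
	}
}

func main() {
	cmd := exec.Command("sleep", "60") // stand-in for the container process
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	killWithGrace(cmd, 2*time.Second)
}

In this log both containers exit within their grace windows (exitCode=0 in the following PLEG events), so the SIGKILL path is never taken.
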
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:08:36 crc kubenswrapper[4778]: I1205 16:08:36.159813 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b63bba1-6d00-4e21-81b5-9c2573000afd-util\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:36 crc kubenswrapper[4778]: I1205 16:08:36.461713 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkbtn" event={"ID":"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2","Type":"ContainerStarted","Data":"bbd92b6847924366eea198b06edd162c280e21aae84bdd4c52b53a29a157a359"} Dec 05 16:08:36 crc kubenswrapper[4778]: I1205 16:08:36.465141 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" event={"ID":"4b63bba1-6d00-4e21-81b5-9c2573000afd","Type":"ContainerDied","Data":"2adab42be8bbe407aaa1bf9d7683bdb29727ee0c9359fcebc987a87d39fa25e5"} Dec 05 16:08:36 crc kubenswrapper[4778]: I1205 16:08:36.465177 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2adab42be8bbe407aaa1bf9d7683bdb29727ee0c9359fcebc987a87d39fa25e5" Dec 05 16:08:36 crc kubenswrapper[4778]: I1205 16:08:36.465195 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn" Dec 05 16:08:36 crc kubenswrapper[4778]: I1205 16:08:36.483285 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wkbtn" podStartSLOduration=2.731999113 podStartE2EDuration="5.483263851s" podCreationTimestamp="2025-12-05 16:08:31 +0000 UTC" firstStartedPulling="2025-12-05 16:08:33.423898501 +0000 UTC m=+800.527694921" lastFinishedPulling="2025-12-05 16:08:36.175163269 +0000 UTC m=+803.278959659" observedRunningTime="2025-12-05 16:08:36.478009355 +0000 UTC m=+803.581805745" watchObservedRunningTime="2025-12-05 16:08:36.483263851 +0000 UTC m=+803.587060261" Dec 05 16:08:40 crc kubenswrapper[4778]: I1205 16:08:40.324386 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vzs5q"] Dec 05 16:08:40 crc kubenswrapper[4778]: I1205 16:08:40.325634 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovn-controller" containerID="cri-o://e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587" gracePeriod=30 Dec 05 16:08:40 crc kubenswrapper[4778]: I1205 16:08:40.325726 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="nbdb" containerID="cri-o://5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910" gracePeriod=30 Dec 05 16:08:40 crc kubenswrapper[4778]: I1205 16:08:40.325806 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="kube-rbac-proxy-node" containerID="cri-o://6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817" gracePeriod=30 Dec 05 16:08:40 crc kubenswrapper[4778]: I1205 16:08:40.325782 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" 
podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13" gracePeriod=30 Dec 05 16:08:40 crc kubenswrapper[4778]: I1205 16:08:40.325859 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovn-acl-logging" containerID="cri-o://9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417" gracePeriod=30 Dec 05 16:08:40 crc kubenswrapper[4778]: I1205 16:08:40.325835 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="sbdb" containerID="cri-o://cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad" gracePeriod=30 Dec 05 16:08:40 crc kubenswrapper[4778]: I1205 16:08:40.325948 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="northd" containerID="cri-o://408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a" gracePeriod=30 Dec 05 16:08:40 crc kubenswrapper[4778]: I1205 16:08:40.379267 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" containerID="cri-o://ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5" gracePeriod=30 Dec 05 16:08:42 crc kubenswrapper[4778]: I1205 16:08:42.123988 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:42 crc kubenswrapper[4778]: I1205 16:08:42.124671 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wkbtn" Dec 05 16:08:43 crc kubenswrapper[4778]: I1205 16:08:43.167558 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wkbtn" podUID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" containerName="registry-server" probeResult="failure" output=< Dec 05 16:08:43 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Dec 05 16:08:43 crc kubenswrapper[4778]: > Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.034292 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad is running failed: container process not found" containerID="cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.034736 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910 is running failed: container process not found" containerID="5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.034944 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910 is running failed: container process not found" containerID="5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.035036 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad is running failed: container process not found" containerID="cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.035137 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910 is running failed: container process not found" containerID="5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.035168 4778 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="nbdb" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.035260 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad is running failed: container process not found" containerID="cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.035285 4778 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="sbdb" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.423209 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/3.log" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.426728 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovn-acl-logging/0.log" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.427233 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovn-controller/0.log" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.427683 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.455885 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-systemd-units\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.455937 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-node-log\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.455992 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovn-node-metrics-cert\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456016 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-openvswitch\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456039 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-var-lib-openvswitch\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456008 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). 
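
The "ExecSync cmd from runtime service failed" / "Probe errored" entries above are the nbdb and sbdb readiness probes racing the deliberate pod teardown: the kubelet asks CRI-O to exec ovndb-readiness-probe inside containers it is simultaneously killing, and the runtime answers NotFound once the process is gone. During a requested stop this is expected noise, not a health regression. A sketch of the classification, hedged: the real call is the CRI ExecSync RPC against a container, stood in for here by a local exec:

// exec_probe_sketch.go - run a readiness command and fold "container already
// gone" into a probe error, as the Probe errored entries above do.
package main

import (
	"errors"
	"fmt"
	"os/exec"
)

var errNotFound = errors.New("container is not created or running")

// execProbe is a stand-in for CRI ExecSync against a container.
func execProbe(containerAlive bool, script string) error {
	if !containerAlive {
		return errNotFound // what CRI-O reports once the PID is gone
	}
	out, err := exec.Command("/bin/bash", "-c", script).CombinedOutput()
	if err != nil {
		return fmt.Errorf("probe failed: %v: %s", err, out)
	}
	return nil
}

func main() {
	// During teardown the container is already dead, so the probe errors out.
	err := execProbe(false, `set -xeo pipefail; echo readiness check here`)
	fmt.Println("Probe errored:", err)
}
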
InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456082 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456037 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-node-log" (OuterVolumeSpecName: "node-log") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456067 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czbcx\" (UniqueName: \"kubernetes.io/projected/6837b168-c691-4e7e-a211-a0c8ef0534e2-kube-api-access-czbcx\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456197 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-slash\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456241 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-etc-openvswitch\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456260 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-cni-netd\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456302 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-run-ovn-kubernetes\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456341 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456400 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-env-overrides\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc 
kubenswrapper[4778]: I1205 16:08:44.456426 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-log-socket\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456454 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-cni-bin\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456473 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-ovn\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456497 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovnkube-script-lib\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456522 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-systemd\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456545 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-kubelet\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456583 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovnkube-config\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456608 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-run-netns\") pod \"6837b168-c691-4e7e-a211-a0c8ef0534e2\" (UID: \"6837b168-c691-4e7e-a211-a0c8ef0534e2\") " Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456950 4778 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456976 4778 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-node-log\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456991 4778 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-var-lib-openvswitch\") on 
node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456058 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.457008 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456910 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.457037 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456919 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-log-socket" (OuterVolumeSpecName: "log-socket") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456942 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.457055 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456956 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456967 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456977 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-slash" (OuterVolumeSpecName: "host-slash") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.456992 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.457150 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.457210 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.457266 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.461888 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.479507 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). 
InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.480273 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6837b168-c691-4e7e-a211-a0c8ef0534e2-kube-api-access-czbcx" (OuterVolumeSpecName: "kube-api-access-czbcx") pod "6837b168-c691-4e7e-a211-a0c8ef0534e2" (UID: "6837b168-c691-4e7e-a211-a0c8ef0534e2"). InnerVolumeSpecName "kube-api-access-czbcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.483824 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bnrp8"] Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484020 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="nbdb" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484034 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="nbdb" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484042 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovn-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484048 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovn-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484060 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovn-acl-logging" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484066 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovn-acl-logging" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484075 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="sbdb" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484080 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="sbdb" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484085 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484091 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484100 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="kubecfg-setup" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484105 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="kubecfg-setup" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484114 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b63bba1-6d00-4e21-81b5-9c2573000afd" containerName="util" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484120 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b63bba1-6d00-4e21-81b5-9c2573000afd" containerName="util" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484126 4778 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484131 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484138 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="kube-rbac-proxy-node" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484144 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="kube-rbac-proxy-node" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484150 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b63bba1-6d00-4e21-81b5-9c2573000afd" containerName="pull" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484155 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b63bba1-6d00-4e21-81b5-9c2573000afd" containerName="pull" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484164 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b63bba1-6d00-4e21-81b5-9c2573000afd" containerName="extract" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484169 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b63bba1-6d00-4e21-81b5-9c2573000afd" containerName="extract" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484177 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484182 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484194 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484199 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484206 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484212 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484221 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="northd" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484227 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="northd" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484310 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovn-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484321 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b63bba1-6d00-4e21-81b5-9c2573000afd" containerName="extract" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484328 4778 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="northd" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484336 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovn-acl-logging" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484342 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484349 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="nbdb" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484354 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484381 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484392 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484398 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484406 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="sbdb" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484413 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484419 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="kube-rbac-proxy-node" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.484505 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.484512 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerName="ovnkube-controller" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.486212 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.506214 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrqmz_9b26d99a-f08e-41d1-b35c-5da99cbe3fb4/kube-multus/2.log" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.508230 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrqmz_9b26d99a-f08e-41d1-b35c-5da99cbe3fb4/kube-multus/1.log" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.508290 4778 generic.go:334] "Generic (PLEG): container finished" podID="9b26d99a-f08e-41d1-b35c-5da99cbe3fb4" containerID="755a407d01c202dcc90aac7a00034bea66f43c3d38847030c371b6abe04a171b" exitCode=2 Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.508641 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrqmz" event={"ID":"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4","Type":"ContainerDied","Data":"755a407d01c202dcc90aac7a00034bea66f43c3d38847030c371b6abe04a171b"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.508689 4778 scope.go:117] "RemoveContainer" containerID="14785ffbf4f13339aecf1ce19a16f571ead8c4db695f06b68174975c500483f3" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.509192 4778 scope.go:117] "RemoveContainer" containerID="755a407d01c202dcc90aac7a00034bea66f43c3d38847030c371b6abe04a171b" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.525176 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovnkube-controller/3.log" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.552016 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovn-acl-logging/0.log" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.552546 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vzs5q_6837b168-c691-4e7e-a211-a0c8ef0534e2/ovn-controller/0.log" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.552960 4778 generic.go:334] "Generic (PLEG): container finished" podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5" exitCode=0 Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.552990 4778 generic.go:334] "Generic (PLEG): container finished" podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad" exitCode=0 Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.552999 4778 generic.go:334] "Generic (PLEG): container finished" podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910" exitCode=0 Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553008 4778 generic.go:334] "Generic (PLEG): container finished" podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a" exitCode=0 Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553017 4778 generic.go:334] "Generic (PLEG): container finished" podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13" exitCode=0 Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553025 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817" exitCode=0 Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553033 4778 generic.go:334] "Generic (PLEG): container finished" podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417" exitCode=143 Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553040 4778 generic.go:334] "Generic (PLEG): container finished" podID="6837b168-c691-4e7e-a211-a0c8ef0534e2" containerID="e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587" exitCode=143 Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553059 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553087 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553102 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553115 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553127 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553138 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553150 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553159 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553164 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553169 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910"} Dec 
05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553174 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553179 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553183 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553189 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553193 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553198 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553205 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553215 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553223 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553229 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553236 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553242 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553249 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553255 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817"} Dec 
05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553262 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553268 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553274 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553285 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553295 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553303 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553309 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553315 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553323 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553328 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553334 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553339 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553344 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553349 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507"} Dec 
05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553356 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" event={"ID":"6837b168-c691-4e7e-a211-a0c8ef0534e2","Type":"ContainerDied","Data":"4c630a0c8da361dcdfcb7d75caedfc1537c7980111735ecef181b7ec88456343"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553381 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553389 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553396 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553402 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553408 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553415 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553421 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553427 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553433 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553439 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507"} Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.553529 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vzs5q" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558008 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-cni-bin\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558308 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-run-openvswitch\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558335 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-run-ovn\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558377 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqrrl\" (UniqueName: \"kubernetes.io/projected/032d5b4f-99e1-4e2e-96f7-488c04936404-kube-api-access-vqrrl\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558405 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-run-systemd\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558425 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/032d5b4f-99e1-4e2e-96f7-488c04936404-ovnkube-script-lib\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558451 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-kubelet\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558502 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-run-ovn-kubernetes\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558526 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-systemd-units\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558550 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/032d5b4f-99e1-4e2e-96f7-488c04936404-ovnkube-config\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558586 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-cni-netd\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558662 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-etc-openvswitch\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558699 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-node-log\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558722 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/032d5b4f-99e1-4e2e-96f7-488c04936404-ovn-node-metrics-cert\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558749 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-var-lib-openvswitch\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558790 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558817 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/032d5b4f-99e1-4e2e-96f7-488c04936404-env-overrides\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558846 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-run-netns\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558865 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-log-socket\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.558885 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-slash\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.559320 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.559443 4778 scope.go:117] "RemoveContainer" containerID="ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560713 4778 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560740 4778 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560753 4778 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560766 4778 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560777 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560789 4778 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560802 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6837b168-c691-4e7e-a211-a0c8ef0534e2-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560816 4778 reconciler_common.go:293] "Volume detached 
for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560828 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czbcx\" (UniqueName: \"kubernetes.io/projected/6837b168-c691-4e7e-a211-a0c8ef0534e2-kube-api-access-czbcx\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560839 4778 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-slash\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560850 4778 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560861 4778 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560877 4778 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560932 4778 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560945 4778 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6837b168-c691-4e7e-a211-a0c8ef0534e2-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.560976 4778 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6837b168-c691-4e7e-a211-a0c8ef0534e2-log-socket\") on node \"crc\" DevicePath \"\"" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.608495 4778 scope.go:117] "RemoveContainer" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.619168 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vzs5q"] Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.635095 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vzs5q"] Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.641615 4778 scope.go:117] "RemoveContainer" containerID="cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.659160 4778 scope.go:117] "RemoveContainer" containerID="5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.662051 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-var-lib-cni-networks-ovn-kubernetes\") 
pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.662227 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/032d5b4f-99e1-4e2e-96f7-488c04936404-env-overrides\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.662322 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-run-netns\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.662440 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-log-socket\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.662535 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-slash\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.662632 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-cni-bin\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.662806 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-run-openvswitch\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.662887 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-run-ovn\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.662966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqrrl\" (UniqueName: \"kubernetes.io/projected/032d5b4f-99e1-4e2e-96f7-488c04936404-kube-api-access-vqrrl\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.663080 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-run-systemd\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.663171 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/032d5b4f-99e1-4e2e-96f7-488c04936404-ovnkube-script-lib\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.663260 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-kubelet\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.663393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-run-ovn-kubernetes\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.663479 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-systemd-units\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.666166 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/032d5b4f-99e1-4e2e-96f7-488c04936404-ovnkube-config\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.666336 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-cni-netd\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.666473 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-node-log\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.666559 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/032d5b4f-99e1-4e2e-96f7-488c04936404-ovn-node-metrics-cert\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.666646 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-etc-openvswitch\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc 
kubenswrapper[4778]: I1205 16:08:44.666736 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-var-lib-openvswitch\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.666922 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-var-lib-openvswitch\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.667063 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.667619 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-run-netns\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.667724 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-log-socket\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.667819 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-slash\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.667919 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-cni-bin\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.668009 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-run-openvswitch\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.668093 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-run-ovn\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.668430 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-systemd-units\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.668511 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-run-systemd\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.669090 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/032d5b4f-99e1-4e2e-96f7-488c04936404-ovnkube-config\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.669384 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-node-log\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.669476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-cni-netd\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.669524 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-kubelet\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.669562 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-host-run-ovn-kubernetes\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.669590 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/032d5b4f-99e1-4e2e-96f7-488c04936404-etc-openvswitch\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.669739 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/032d5b4f-99e1-4e2e-96f7-488c04936404-ovnkube-script-lib\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.674631 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/032d5b4f-99e1-4e2e-96f7-488c04936404-ovn-node-metrics-cert\") pod \"ovnkube-node-bnrp8\" (UID: 
\"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.680549 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/032d5b4f-99e1-4e2e-96f7-488c04936404-env-overrides\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.683541 4778 scope.go:117] "RemoveContainer" containerID="408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.690382 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqrrl\" (UniqueName: \"kubernetes.io/projected/032d5b4f-99e1-4e2e-96f7-488c04936404-kube-api-access-vqrrl\") pod \"ovnkube-node-bnrp8\" (UID: \"032d5b4f-99e1-4e2e-96f7-488c04936404\") " pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.704671 4778 scope.go:117] "RemoveContainer" containerID="d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.722093 4778 scope.go:117] "RemoveContainer" containerID="6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.736779 4778 scope.go:117] "RemoveContainer" containerID="9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.749933 4778 scope.go:117] "RemoveContainer" containerID="e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.771024 4778 scope.go:117] "RemoveContainer" containerID="9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.791760 4778 scope.go:117] "RemoveContainer" containerID="ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.792775 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5\": container with ID starting with ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5 not found: ID does not exist" containerID="ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.792817 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5"} err="failed to get container status \"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5\": rpc error: code = NotFound desc = could not find container \"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5\": container with ID starting with ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.792837 4778 scope.go:117] "RemoveContainer" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.793317 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\": container with ID starting with 3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751 not found: ID does not exist" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.793389 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751"} err="failed to get container status \"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\": rpc error: code = NotFound desc = could not find container \"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\": container with ID starting with 3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.793423 4778 scope.go:117] "RemoveContainer" containerID="cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.797753 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\": container with ID starting with cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad not found: ID does not exist" containerID="cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.797793 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad"} err="failed to get container status \"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\": rpc error: code = NotFound desc = could not find container \"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\": container with ID starting with cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.797817 4778 scope.go:117] "RemoveContainer" containerID="5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.798279 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\": container with ID starting with 5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910 not found: ID does not exist" containerID="5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.798317 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910"} err="failed to get container status \"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\": rpc error: code = NotFound desc = could not find container \"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\": container with ID starting with 5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.798337 4778 scope.go:117] "RemoveContainer" containerID="408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a" Dec 05 16:08:44 crc 
kubenswrapper[4778]: E1205 16:08:44.798719 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\": container with ID starting with 408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a not found: ID does not exist" containerID="408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.798775 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a"} err="failed to get container status \"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\": rpc error: code = NotFound desc = could not find container \"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\": container with ID starting with 408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.798811 4778 scope.go:117] "RemoveContainer" containerID="d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.799133 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\": container with ID starting with d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13 not found: ID does not exist" containerID="d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.799157 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13"} err="failed to get container status \"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\": rpc error: code = NotFound desc = could not find container \"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\": container with ID starting with d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.799174 4778 scope.go:117] "RemoveContainer" containerID="6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.799428 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\": container with ID starting with 6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817 not found: ID does not exist" containerID="6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.799454 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817"} err="failed to get container status \"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\": rpc error: code = NotFound desc = could not find container \"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\": container with ID starting with 6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: 
I1205 16:08:44.799471 4778 scope.go:117] "RemoveContainer" containerID="9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.799797 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\": container with ID starting with 9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417 not found: ID does not exist" containerID="9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.799820 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417"} err="failed to get container status \"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\": rpc error: code = NotFound desc = could not find container \"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\": container with ID starting with 9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.799832 4778 scope.go:117] "RemoveContainer" containerID="e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.800066 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\": container with ID starting with e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587 not found: ID does not exist" containerID="e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.800086 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587"} err="failed to get container status \"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\": rpc error: code = NotFound desc = could not find container \"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\": container with ID starting with e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.800097 4778 scope.go:117] "RemoveContainer" containerID="9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507" Dec 05 16:08:44 crc kubenswrapper[4778]: E1205 16:08:44.800352 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\": container with ID starting with 9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507 not found: ID does not exist" containerID="9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.800383 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507"} err="failed to get container status \"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\": rpc error: code = NotFound desc = could not find container \"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\": container 
with ID starting with 9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.800395 4778 scope.go:117] "RemoveContainer" containerID="ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.800676 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5"} err="failed to get container status \"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5\": rpc error: code = NotFound desc = could not find container \"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5\": container with ID starting with ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.800693 4778 scope.go:117] "RemoveContainer" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.800976 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751"} err="failed to get container status \"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\": rpc error: code = NotFound desc = could not find container \"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\": container with ID starting with 3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.800992 4778 scope.go:117] "RemoveContainer" containerID="cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.801212 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad"} err="failed to get container status \"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\": rpc error: code = NotFound desc = could not find container \"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\": container with ID starting with cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.801229 4778 scope.go:117] "RemoveContainer" containerID="5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.801557 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910"} err="failed to get container status \"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\": rpc error: code = NotFound desc = could not find container \"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\": container with ID starting with 5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.801598 4778 scope.go:117] "RemoveContainer" containerID="408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.801894 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a"} err="failed to get container status \"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\": rpc error: code = NotFound desc = could not find container \"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\": container with ID starting with 408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.801914 4778 scope.go:117] "RemoveContainer" containerID="d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.802176 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13"} err="failed to get container status \"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\": rpc error: code = NotFound desc = could not find container \"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\": container with ID starting with d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.802195 4778 scope.go:117] "RemoveContainer" containerID="6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.802422 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817"} err="failed to get container status \"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\": rpc error: code = NotFound desc = could not find container \"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\": container with ID starting with 6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.802445 4778 scope.go:117] "RemoveContainer" containerID="9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.806828 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417"} err="failed to get container status \"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\": rpc error: code = NotFound desc = could not find container \"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\": container with ID starting with 9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.806883 4778 scope.go:117] "RemoveContainer" containerID="e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.807355 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587"} err="failed to get container status \"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\": rpc error: code = NotFound desc = could not find container \"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\": container with ID starting with e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587 not found: ID does not exist" Dec 
05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.807409 4778 scope.go:117] "RemoveContainer" containerID="9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.807692 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507"} err="failed to get container status \"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\": rpc error: code = NotFound desc = could not find container \"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\": container with ID starting with 9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.807713 4778 scope.go:117] "RemoveContainer" containerID="ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.807996 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5"} err="failed to get container status \"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5\": rpc error: code = NotFound desc = could not find container \"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5\": container with ID starting with ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.808014 4778 scope.go:117] "RemoveContainer" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.808282 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751"} err="failed to get container status \"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\": rpc error: code = NotFound desc = could not find container \"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\": container with ID starting with 3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.808303 4778 scope.go:117] "RemoveContainer" containerID="cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.808648 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad"} err="failed to get container status \"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\": rpc error: code = NotFound desc = could not find container \"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\": container with ID starting with cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.808667 4778 scope.go:117] "RemoveContainer" containerID="5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.808908 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910"} err="failed to get container status 
\"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\": rpc error: code = NotFound desc = could not find container \"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\": container with ID starting with 5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.808926 4778 scope.go:117] "RemoveContainer" containerID="408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.809307 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a"} err="failed to get container status \"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\": rpc error: code = NotFound desc = could not find container \"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\": container with ID starting with 408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.809332 4778 scope.go:117] "RemoveContainer" containerID="d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.811578 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13"} err="failed to get container status \"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\": rpc error: code = NotFound desc = could not find container \"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\": container with ID starting with d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.811604 4778 scope.go:117] "RemoveContainer" containerID="6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.811812 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817"} err="failed to get container status \"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\": rpc error: code = NotFound desc = could not find container \"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\": container with ID starting with 6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.811832 4778 scope.go:117] "RemoveContainer" containerID="9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.812049 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417"} err="failed to get container status \"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\": rpc error: code = NotFound desc = could not find container \"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\": container with ID starting with 9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.812071 4778 scope.go:117] "RemoveContainer" 
containerID="e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.812522 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.814688 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587"} err="failed to get container status \"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\": rpc error: code = NotFound desc = could not find container \"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\": container with ID starting with e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.814716 4778 scope.go:117] "RemoveContainer" containerID="9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.815146 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507"} err="failed to get container status \"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\": rpc error: code = NotFound desc = could not find container \"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\": container with ID starting with 9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.815167 4778 scope.go:117] "RemoveContainer" containerID="ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.815431 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5"} err="failed to get container status \"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5\": rpc error: code = NotFound desc = could not find container \"ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5\": container with ID starting with ea86ffbf019eaf584431e15c67fc388800ebf78fdb88ce3b3aac2bc94ee277d5 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.815449 4778 scope.go:117] "RemoveContainer" containerID="3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.815704 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751"} err="failed to get container status \"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\": rpc error: code = NotFound desc = could not find container \"3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751\": container with ID starting with 3751ff6c2307b472e2e5c882d92c4baa49b3ada5ceb0f7aa937eb628e761b751 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.815728 4778 scope.go:117] "RemoveContainer" containerID="cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.816067 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad"} err="failed to get container status \"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\": rpc error: code = NotFound desc = could not find container \"cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad\": container with ID starting with cf843ac1288a7c40d22d79f412df9edfd46624a8d07869ef10164c5a782119ad not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.816103 4778 scope.go:117] "RemoveContainer" containerID="5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.816554 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910"} err="failed to get container status \"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\": rpc error: code = NotFound desc = could not find container \"5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910\": container with ID starting with 5a54caff46ed9f901b457f47d98ef14af3e6594d8b0e2378da405f134f06a910 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.816574 4778 scope.go:117] "RemoveContainer" containerID="408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.816798 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a"} err="failed to get container status \"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\": rpc error: code = NotFound desc = could not find container \"408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a\": container with ID starting with 408326bfbf4ec5d4a1ba10149c806d21d190420f66bdf4076617461a0fb2890a not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.816819 4778 scope.go:117] "RemoveContainer" containerID="d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.817081 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13"} err="failed to get container status \"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\": rpc error: code = NotFound desc = could not find container \"d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13\": container with ID starting with d97f89f704e34d747edb85fe210df598f033a76b07084add81591eb65da6fb13 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.817099 4778 scope.go:117] "RemoveContainer" containerID="6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.817379 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817"} err="failed to get container status \"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\": rpc error: code = NotFound desc = could not find container \"6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817\": container with ID starting with 6615056dcaeb10d2128099a57081d30c932d0ca40ea99fda21fe80bbb7b6c817 not found: ID does not exist" Dec 
05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.817396 4778 scope.go:117] "RemoveContainer" containerID="9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.821714 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417"} err="failed to get container status \"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\": rpc error: code = NotFound desc = could not find container \"9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417\": container with ID starting with 9721b7eda061e690941b934d7c62c010d3a61a3264299245a3d2ad8f542e3417 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.821751 4778 scope.go:117] "RemoveContainer" containerID="e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.822155 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587"} err="failed to get container status \"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\": rpc error: code = NotFound desc = could not find container \"e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587\": container with ID starting with e746f8c064e4332720555426df58570dede431da8d7b54ad3bf5e600f73ce587 not found: ID does not exist" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.822189 4778 scope.go:117] "RemoveContainer" containerID="9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507" Dec 05 16:08:44 crc kubenswrapper[4778]: I1205 16:08:44.822441 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507"} err="failed to get container status \"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\": rpc error: code = NotFound desc = could not find container \"9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507\": container with ID starting with 9564e6b31f54699af93c295da382b5764828b4b00dbda5463ace1d3883ca4507 not found: ID does not exist" Dec 05 16:08:45 crc kubenswrapper[4778]: I1205 16:08:45.255890 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6837b168-c691-4e7e-a211-a0c8ef0534e2" path="/var/lib/kubelet/pods/6837b168-c691-4e7e-a211-a0c8ef0534e2/volumes" Dec 05 16:08:45 crc kubenswrapper[4778]: I1205 16:08:45.558706 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrqmz_9b26d99a-f08e-41d1-b35c-5da99cbe3fb4/kube-multus/2.log" Dec 05 16:08:45 crc kubenswrapper[4778]: I1205 16:08:45.560717 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" event={"ID":"032d5b4f-99e1-4e2e-96f7-488c04936404","Type":"ContainerStarted","Data":"bf810c492a112d8f8aae9eb764fb4a93c540f79123aee3c6f54da1b8f49d0fec"} Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.700256 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p"] Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.700996 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.707380 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.708954 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.709502 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rklcx" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.757438 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf"] Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.758030 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.759673 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.762061 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-j7n7n" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.771714 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v"] Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.772539 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.792316 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22a6d7c0-787b-4db2-b559-a15f91626619-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v\" (UID: \"22a6d7c0-787b-4db2-b559-a15f91626619\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.792401 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06f22ef3-dbf4-44cd-bd3c-099d4b23c440-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf\" (UID: \"06f22ef3-dbf4-44cd-bd3c-099d4b23c440\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.792429 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06f22ef3-dbf4-44cd-bd3c-099d4b23c440-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf\" (UID: \"06f22ef3-dbf4-44cd-bd3c-099d4b23c440\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.792485 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzfpz\" (UniqueName: \"kubernetes.io/projected/7d00eb5e-6107-4c91-b9ed-540833e16404-kube-api-access-qzfpz\") pod \"obo-prometheus-operator-668cf9dfbb-phf8p\" (UID: \"7d00eb5e-6107-4c91-b9ed-540833e16404\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.792572 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22a6d7c0-787b-4db2-b559-a15f91626619-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v\" (UID: \"22a6d7c0-787b-4db2-b559-a15f91626619\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.893595 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzfpz\" (UniqueName: \"kubernetes.io/projected/7d00eb5e-6107-4c91-b9ed-540833e16404-kube-api-access-qzfpz\") pod \"obo-prometheus-operator-668cf9dfbb-phf8p\" (UID: \"7d00eb5e-6107-4c91-b9ed-540833e16404\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.893640 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22a6d7c0-787b-4db2-b559-a15f91626619-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v\" (UID: \"22a6d7c0-787b-4db2-b559-a15f91626619\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.893672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/22a6d7c0-787b-4db2-b559-a15f91626619-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v\" (UID: \"22a6d7c0-787b-4db2-b559-a15f91626619\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.893708 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06f22ef3-dbf4-44cd-bd3c-099d4b23c440-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf\" (UID: \"06f22ef3-dbf4-44cd-bd3c-099d4b23c440\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.893727 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06f22ef3-dbf4-44cd-bd3c-099d4b23c440-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf\" (UID: \"06f22ef3-dbf4-44cd-bd3c-099d4b23c440\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.903166 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06f22ef3-dbf4-44cd-bd3c-099d4b23c440-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf\" (UID: \"06f22ef3-dbf4-44cd-bd3c-099d4b23c440\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.903165 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06f22ef3-dbf4-44cd-bd3c-099d4b23c440-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf\" (UID: \"06f22ef3-dbf4-44cd-bd3c-099d4b23c440\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.910002 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22a6d7c0-787b-4db2-b559-a15f91626619-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v\" (UID: \"22a6d7c0-787b-4db2-b559-a15f91626619\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.912942 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22a6d7c0-787b-4db2-b559-a15f91626619-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v\" (UID: \"22a6d7c0-787b-4db2-b559-a15f91626619\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.932875 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzfpz\" (UniqueName: \"kubernetes.io/projected/7d00eb5e-6107-4c91-b9ed-540833e16404-kube-api-access-qzfpz\") pod \"obo-prometheus-operator-668cf9dfbb-phf8p\" (UID: \"7d00eb5e-6107-4c91-b9ed-540833e16404\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.954175 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/observability-operator-d8bb48f5d-5xcd4"] Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.954996 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.956935 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-x8cf6" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.956939 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.994708 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfs8v\" (UniqueName: \"kubernetes.io/projected/8e816989-62eb-47ce-a33b-2f09f1d2b3c6-kube-api-access-sfs8v\") pod \"observability-operator-d8bb48f5d-5xcd4\" (UID: \"8e816989-62eb-47ce-a33b-2f09f1d2b3c6\") " pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" Dec 05 16:08:46 crc kubenswrapper[4778]: I1205 16:08:46.994886 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e816989-62eb-47ce-a33b-2f09f1d2b3c6-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-5xcd4\" (UID: \"8e816989-62eb-47ce-a33b-2f09f1d2b3c6\") " pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.022886 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.034384 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-nvd8k"] Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.035051 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nvd8k" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.038749 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-t9qps" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.044119 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-phf8p_openshift-operators_7d00eb5e-6107-4c91-b9ed-540833e16404_0(c94a575602e35cfaf54d918163095ebb73f58e36443c876937a1515efa68a5d6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.044175 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-phf8p_openshift-operators_7d00eb5e-6107-4c91-b9ed-540833e16404_0(c94a575602e35cfaf54d918163095ebb73f58e36443c876937a1515efa68a5d6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.044197 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-phf8p_openshift-operators_7d00eb5e-6107-4c91-b9ed-540833e16404_0(c94a575602e35cfaf54d918163095ebb73f58e36443c876937a1515efa68a5d6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.044240 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-phf8p_openshift-operators(7d00eb5e-6107-4c91-b9ed-540833e16404)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-phf8p_openshift-operators(7d00eb5e-6107-4c91-b9ed-540833e16404)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-phf8p_openshift-operators_7d00eb5e-6107-4c91-b9ed-540833e16404_0(c94a575602e35cfaf54d918163095ebb73f58e36443c876937a1515efa68a5d6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p" podUID="7d00eb5e-6107-4c91-b9ed-540833e16404" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.072127 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.087530 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.096152 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/da29284c-59ce-4aed-a280-0b9b550c2c96-openshift-service-ca\") pod \"perses-operator-5446b9c989-nvd8k\" (UID: \"da29284c-59ce-4aed-a280-0b9b550c2c96\") " pod="openshift-operators/perses-operator-5446b9c989-nvd8k" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.096211 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfs8v\" (UniqueName: \"kubernetes.io/projected/8e816989-62eb-47ce-a33b-2f09f1d2b3c6-kube-api-access-sfs8v\") pod \"observability-operator-d8bb48f5d-5xcd4\" (UID: \"8e816989-62eb-47ce-a33b-2f09f1d2b3c6\") " pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.096248 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e816989-62eb-47ce-a33b-2f09f1d2b3c6-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-5xcd4\" (UID: \"8e816989-62eb-47ce-a33b-2f09f1d2b3c6\") " pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.096274 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx6mn\" (UniqueName: \"kubernetes.io/projected/da29284c-59ce-4aed-a280-0b9b550c2c96-kube-api-access-nx6mn\") pod \"perses-operator-5446b9c989-nvd8k\" (UID: \"da29284c-59ce-4aed-a280-0b9b550c2c96\") " pod="openshift-operators/perses-operator-5446b9c989-nvd8k" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.098873 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_openshift-operators_06f22ef3-dbf4-44cd-bd3c-099d4b23c440_0(f464d96e80fd442479a7c02f0e2ff23225be6e81bba5be99e6f2038ccca3e238): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.098927 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_openshift-operators_06f22ef3-dbf4-44cd-bd3c-099d4b23c440_0(f464d96e80fd442479a7c02f0e2ff23225be6e81bba5be99e6f2038ccca3e238): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.098950 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_openshift-operators_06f22ef3-dbf4-44cd-bd3c-099d4b23c440_0(f464d96e80fd442479a7c02f0e2ff23225be6e81bba5be99e6f2038ccca3e238): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.098996 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_openshift-operators(06f22ef3-dbf4-44cd-bd3c-099d4b23c440)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_openshift-operators(06f22ef3-dbf4-44cd-bd3c-099d4b23c440)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_openshift-operators_06f22ef3-dbf4-44cd-bd3c-099d4b23c440_0(f464d96e80fd442479a7c02f0e2ff23225be6e81bba5be99e6f2038ccca3e238): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" podUID="06f22ef3-dbf4-44cd-bd3c-099d4b23c440" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.099486 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e816989-62eb-47ce-a33b-2f09f1d2b3c6-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-5xcd4\" (UID: \"8e816989-62eb-47ce-a33b-2f09f1d2b3c6\") " pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.111180 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfs8v\" (UniqueName: \"kubernetes.io/projected/8e816989-62eb-47ce-a33b-2f09f1d2b3c6-kube-api-access-sfs8v\") pod \"observability-operator-d8bb48f5d-5xcd4\" (UID: \"8e816989-62eb-47ce-a33b-2f09f1d2b3c6\") " pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.117626 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_openshift-operators_22a6d7c0-787b-4db2-b559-a15f91626619_0(434fd075f3b06328c2d401a189a8860bfd9ce61f84262f7f85456353420fec50): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.117694 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_openshift-operators_22a6d7c0-787b-4db2-b559-a15f91626619_0(434fd075f3b06328c2d401a189a8860bfd9ce61f84262f7f85456353420fec50): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.117718 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_openshift-operators_22a6d7c0-787b-4db2-b559-a15f91626619_0(434fd075f3b06328c2d401a189a8860bfd9ce61f84262f7f85456353420fec50): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.117770 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_openshift-operators(22a6d7c0-787b-4db2-b559-a15f91626619)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_openshift-operators(22a6d7c0-787b-4db2-b559-a15f91626619)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_openshift-operators_22a6d7c0-787b-4db2-b559-a15f91626619_0(434fd075f3b06328c2d401a189a8860bfd9ce61f84262f7f85456353420fec50): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" podUID="22a6d7c0-787b-4db2-b559-a15f91626619" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.197807 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/da29284c-59ce-4aed-a280-0b9b550c2c96-openshift-service-ca\") pod \"perses-operator-5446b9c989-nvd8k\" (UID: \"da29284c-59ce-4aed-a280-0b9b550c2c96\") " pod="openshift-operators/perses-operator-5446b9c989-nvd8k" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.197911 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx6mn\" (UniqueName: \"kubernetes.io/projected/da29284c-59ce-4aed-a280-0b9b550c2c96-kube-api-access-nx6mn\") pod \"perses-operator-5446b9c989-nvd8k\" (UID: \"da29284c-59ce-4aed-a280-0b9b550c2c96\") " pod="openshift-operators/perses-operator-5446b9c989-nvd8k" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.199236 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/da29284c-59ce-4aed-a280-0b9b550c2c96-openshift-service-ca\") pod \"perses-operator-5446b9c989-nvd8k\" (UID: \"da29284c-59ce-4aed-a280-0b9b550c2c96\") " pod="openshift-operators/perses-operator-5446b9c989-nvd8k" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.212687 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx6mn\" (UniqueName: \"kubernetes.io/projected/da29284c-59ce-4aed-a280-0b9b550c2c96-kube-api-access-nx6mn\") pod \"perses-operator-5446b9c989-nvd8k\" (UID: \"da29284c-59ce-4aed-a280-0b9b550c2c96\") " pod="openshift-operators/perses-operator-5446b9c989-nvd8k" Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.269261 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.300394 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5xcd4_openshift-operators_8e816989-62eb-47ce-a33b-2f09f1d2b3c6_0(f85193c381f071aea36852b0ef4feb78cf1de4de3d389f9507237b3eff1996d3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.300456 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5xcd4_openshift-operators_8e816989-62eb-47ce-a33b-2f09f1d2b3c6_0(f85193c381f071aea36852b0ef4feb78cf1de4de3d389f9507237b3eff1996d3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4"
Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.300484 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5xcd4_openshift-operators_8e816989-62eb-47ce-a33b-2f09f1d2b3c6_0(f85193c381f071aea36852b0ef4feb78cf1de4de3d389f9507237b3eff1996d3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4"
Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.300543 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-5xcd4_openshift-operators(8e816989-62eb-47ce-a33b-2f09f1d2b3c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-5xcd4_openshift-operators(8e816989-62eb-47ce-a33b-2f09f1d2b3c6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5xcd4_openshift-operators_8e816989-62eb-47ce-a33b-2f09f1d2b3c6_0(f85193c381f071aea36852b0ef4feb78cf1de4de3d389f9507237b3eff1996d3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" podUID="8e816989-62eb-47ce-a33b-2f09f1d2b3c6"
Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.378445 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nvd8k"
Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.400659 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nvd8k_openshift-operators_da29284c-59ce-4aed-a280-0b9b550c2c96_0(4e8db73dcd2f89187058465512a76921491b2e141450e57bc36dbe4fc63b2b55): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.400722 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nvd8k_openshift-operators_da29284c-59ce-4aed-a280-0b9b550c2c96_0(4e8db73dcd2f89187058465512a76921491b2e141450e57bc36dbe4fc63b2b55): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-nvd8k"
Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.400754 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nvd8k_openshift-operators_da29284c-59ce-4aed-a280-0b9b550c2c96_0(4e8db73dcd2f89187058465512a76921491b2e141450e57bc36dbe4fc63b2b55): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-nvd8k"
Dec 05 16:08:47 crc kubenswrapper[4778]: E1205 16:08:47.400801 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-nvd8k_openshift-operators(da29284c-59ce-4aed-a280-0b9b550c2c96)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-nvd8k_openshift-operators(da29284c-59ce-4aed-a280-0b9b550c2c96)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nvd8k_openshift-operators_da29284c-59ce-4aed-a280-0b9b550c2c96_0(4e8db73dcd2f89187058465512a76921491b2e141450e57bc36dbe4fc63b2b55): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-nvd8k" podUID="da29284c-59ce-4aed-a280-0b9b550c2c96"
Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.575004 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrqmz_9b26d99a-f08e-41d1-b35c-5da99cbe3fb4/kube-multus/2.log"
Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.575089 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrqmz" event={"ID":"9b26d99a-f08e-41d1-b35c-5da99cbe3fb4","Type":"ContainerStarted","Data":"ee1f2729db217fdebe4ce39fd8921f4065a71ba6b353f0d973871e00233069b8"}
Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.576612 4778 generic.go:334] "Generic (PLEG): container finished" podID="032d5b4f-99e1-4e2e-96f7-488c04936404" containerID="72eb23bcd16c83790ef6e219ef3a3571264afdbf849001117455aef22835b1c2" exitCode=0
Dec 05 16:08:47 crc kubenswrapper[4778]: I1205 16:08:47.576665 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" event={"ID":"032d5b4f-99e1-4e2e-96f7-488c04936404","Type":"ContainerDied","Data":"72eb23bcd16c83790ef6e219ef3a3571264afdbf849001117455aef22835b1c2"}
Dec 05 16:08:48 crc kubenswrapper[4778]: I1205 16:08:48.586200 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" event={"ID":"032d5b4f-99e1-4e2e-96f7-488c04936404","Type":"ContainerStarted","Data":"1c4aa64d909122dd1486121a33ebb9e2867c781edf6f3cb97d1e5c148dc41e97"}
Dec 05 16:08:48 crc kubenswrapper[4778]: I1205 16:08:48.586256 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" event={"ID":"032d5b4f-99e1-4e2e-96f7-488c04936404","Type":"ContainerStarted","Data":"ce2aeed8e50d4f6b6b51b8a166432a40307ea63e66c5028e9bbc79141e20cf9d"}
Dec 05 16:08:48 crc kubenswrapper[4778]: I1205 16:08:48.586268 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" event={"ID":"032d5b4f-99e1-4e2e-96f7-488c04936404","Type":"ContainerStarted","Data":"6eaeb534b8fc5297c957dfd4f30dd809ab28f9a0e15c8f3397d0ae700350478d"}
Dec 05 16:08:48 crc kubenswrapper[4778]: I1205 16:08:48.586279 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" event={"ID":"032d5b4f-99e1-4e2e-96f7-488c04936404","Type":"ContainerStarted","Data":"ea9095fbf3c0fbeb32a07e564f1cc951208be94da6f6ba9fd58233385a696e34"}
Dec 05 16:08:48 crc kubenswrapper[4778]: I1205 16:08:48.586291 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" event={"ID":"032d5b4f-99e1-4e2e-96f7-488c04936404","Type":"ContainerStarted","Data":"7589064f19d027ed55db3e0565a3083b48fdbfe31149de2ce6e5a1e3eeedb200"}
Dec 05 16:08:48 crc kubenswrapper[4778]: I1205 16:08:48.586300 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" event={"ID":"032d5b4f-99e1-4e2e-96f7-488c04936404","Type":"ContainerStarted","Data":"1aa5686c8ed8597bc64fbe02c4c8228f708caa09bd4b91097efd09230325fe6a"}
Dec 05 16:08:50 crc kubenswrapper[4778]: I1205 16:08:50.601136 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" event={"ID":"032d5b4f-99e1-4e2e-96f7-488c04936404","Type":"ContainerStarted","Data":"bfc5c5fb23a8b97de69f2c629e3791b725e7223c0875a2111963eb850e43c459"}
Dec 05 16:08:52 crc kubenswrapper[4778]: I1205 16:08:52.179509 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wkbtn"
Dec 05 16:08:52 crc kubenswrapper[4778]: I1205 16:08:52.227720 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wkbtn"
Dec 05 16:08:53 crc kubenswrapper[4778]: I1205 16:08:53.187832 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wkbtn"]
Dec 05 16:08:53 crc kubenswrapper[4778]: I1205 16:08:53.620717 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wkbtn" podUID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" containerName="registry-server" containerID="cri-o://bbd92b6847924366eea198b06edd162c280e21aae84bdd4c52b53a29a157a359" gracePeriod=2
Dec 05 16:08:53 crc kubenswrapper[4778]: I1205 16:08:53.622156 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" event={"ID":"032d5b4f-99e1-4e2e-96f7-488c04936404","Type":"ContainerStarted","Data":"eb1d099efea56b75d9a39bd5fad85c8a6f85dfcd2dc4cd97b5f4bdeceb0e6940"}
Dec 05 16:08:53 crc kubenswrapper[4778]: I1205 16:08:53.622200 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8"
Dec 05 16:08:53 crc kubenswrapper[4778]: I1205 16:08:53.622526 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8"
Dec 05 16:08:53 crc kubenswrapper[4778]: I1205 16:08:53.622577 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8"
Dec 05 16:08:53 crc kubenswrapper[4778]: I1205 16:08:53.660091 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8" podStartSLOduration=9.660075776 podStartE2EDuration="9.660075776s" podCreationTimestamp="2025-12-05 16:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:08:53.659174831 +0000 UTC m=+820.762971221" watchObservedRunningTime="2025-12-05 16:08:53.660075776 +0000 UTC m=+820.763872156"
Dec 05 16:08:53 crc kubenswrapper[4778]: I1205 16:08:53.666610 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8"
Dec 05 16:08:53 crc kubenswrapper[4778]: I1205 16:08:53.667127 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8"
Dec 05 16:08:53 crc kubenswrapper[4778]: E1205 16:08:53.760645 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9b2e4be_cc3d_4f22_87ab_b6a7718e92e2.slice/crio-bbd92b6847924366eea198b06edd162c280e21aae84bdd4c52b53a29a157a359.scope\": RecentStats: unable to find data in memory cache]"
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.214179 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p"]
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.214299 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p"
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.214725 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p"
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.223727 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf"]
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.223846 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf"
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.224217 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf"
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.261919 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v"]
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.262061 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v"
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.262591 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v"
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.268858 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-5xcd4"]
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.268964 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4"
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.269317 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4"
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.296595 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-nvd8k"]
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.296710 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nvd8k"
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.297080 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nvd8k"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.629882 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-phf8p_openshift-operators_7d00eb5e-6107-4c91-b9ed-540833e16404_0(0b8b562b72c27224fd7f960324ecacc029ec9ebc6386cb64a3c7f927ca958c1b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.629962 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-phf8p_openshift-operators_7d00eb5e-6107-4c91-b9ed-540833e16404_0(0b8b562b72c27224fd7f960324ecacc029ec9ebc6386cb64a3c7f927ca958c1b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.629992 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-phf8p_openshift-operators_7d00eb5e-6107-4c91-b9ed-540833e16404_0(0b8b562b72c27224fd7f960324ecacc029ec9ebc6386cb64a3c7f927ca958c1b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.630042 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-phf8p_openshift-operators(7d00eb5e-6107-4c91-b9ed-540833e16404)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-phf8p_openshift-operators(7d00eb5e-6107-4c91-b9ed-540833e16404)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-phf8p_openshift-operators_7d00eb5e-6107-4c91-b9ed-540833e16404_0(0b8b562b72c27224fd7f960324ecacc029ec9ebc6386cb64a3c7f927ca958c1b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p" podUID="7d00eb5e-6107-4c91-b9ed-540833e16404"
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.637110 4778 generic.go:334] "Generic (PLEG): container finished" podID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" containerID="bbd92b6847924366eea198b06edd162c280e21aae84bdd4c52b53a29a157a359" exitCode=0
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.637243 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkbtn" event={"ID":"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2","Type":"ContainerDied","Data":"bbd92b6847924366eea198b06edd162c280e21aae84bdd4c52b53a29a157a359"}
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.652629 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5xcd4_openshift-operators_8e816989-62eb-47ce-a33b-2f09f1d2b3c6_0(52c1bb054cfccd3fb7abeb242756ee3cf0bad040bf2ce969db3a02b372b4b7e0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.652703 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5xcd4_openshift-operators_8e816989-62eb-47ce-a33b-2f09f1d2b3c6_0(52c1bb054cfccd3fb7abeb242756ee3cf0bad040bf2ce969db3a02b372b4b7e0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.652732 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5xcd4_openshift-operators_8e816989-62eb-47ce-a33b-2f09f1d2b3c6_0(52c1bb054cfccd3fb7abeb242756ee3cf0bad040bf2ce969db3a02b372b4b7e0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.652786 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-5xcd4_openshift-operators(8e816989-62eb-47ce-a33b-2f09f1d2b3c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-5xcd4_openshift-operators(8e816989-62eb-47ce-a33b-2f09f1d2b3c6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5xcd4_openshift-operators_8e816989-62eb-47ce-a33b-2f09f1d2b3c6_0(52c1bb054cfccd3fb7abeb242756ee3cf0bad040bf2ce969db3a02b372b4b7e0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" podUID="8e816989-62eb-47ce-a33b-2f09f1d2b3c6"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.661898 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_openshift-operators_22a6d7c0-787b-4db2-b559-a15f91626619_0(5677cea0fbaaeb45c63e4bd9ac50a515890a4979f45d88b95339a39e71e822d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.661960 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_openshift-operators_22a6d7c0-787b-4db2-b559-a15f91626619_0(5677cea0fbaaeb45c63e4bd9ac50a515890a4979f45d88b95339a39e71e822d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.661987 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_openshift-operators_22a6d7c0-787b-4db2-b559-a15f91626619_0(5677cea0fbaaeb45c63e4bd9ac50a515890a4979f45d88b95339a39e71e822d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.662042 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_openshift-operators(22a6d7c0-787b-4db2-b559-a15f91626619)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_openshift-operators(22a6d7c0-787b-4db2-b559-a15f91626619)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_openshift-operators_22a6d7c0-787b-4db2-b559-a15f91626619_0(5677cea0fbaaeb45c63e4bd9ac50a515890a4979f45d88b95339a39e71e822d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" podUID="22a6d7c0-787b-4db2-b559-a15f91626619"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.666489 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nvd8k_openshift-operators_da29284c-59ce-4aed-a280-0b9b550c2c96_0(c2499b208ec14fb26cdf5cd01edbdea5d10f47efce4047de81c4b29de7c30bef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.666531 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nvd8k_openshift-operators_da29284c-59ce-4aed-a280-0b9b550c2c96_0(c2499b208ec14fb26cdf5cd01edbdea5d10f47efce4047de81c4b29de7c30bef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-nvd8k"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.666549 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nvd8k_openshift-operators_da29284c-59ce-4aed-a280-0b9b550c2c96_0(c2499b208ec14fb26cdf5cd01edbdea5d10f47efce4047de81c4b29de7c30bef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-nvd8k"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.666582 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-nvd8k_openshift-operators(da29284c-59ce-4aed-a280-0b9b550c2c96)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-nvd8k_openshift-operators(da29284c-59ce-4aed-a280-0b9b550c2c96)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nvd8k_openshift-operators_da29284c-59ce-4aed-a280-0b9b550c2c96_0(c2499b208ec14fb26cdf5cd01edbdea5d10f47efce4047de81c4b29de7c30bef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-nvd8k" podUID="da29284c-59ce-4aed-a280-0b9b550c2c96"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.670546 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_openshift-operators_06f22ef3-dbf4-44cd-bd3c-099d4b23c440_0(83e3a3e8e4437517628dd0ca7a934dfefa77e2ea385edd2ca82154c2bd346e5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.670577 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_openshift-operators_06f22ef3-dbf4-44cd-bd3c-099d4b23c440_0(83e3a3e8e4437517628dd0ca7a934dfefa77e2ea385edd2ca82154c2bd346e5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.670591 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_openshift-operators_06f22ef3-dbf4-44cd-bd3c-099d4b23c440_0(83e3a3e8e4437517628dd0ca7a934dfefa77e2ea385edd2ca82154c2bd346e5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf"
Dec 05 16:08:54 crc kubenswrapper[4778]: E1205 16:08:54.670620 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_openshift-operators(06f22ef3-dbf4-44cd-bd3c-099d4b23c440)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_openshift-operators(06f22ef3-dbf4-44cd-bd3c-099d4b23c440)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_openshift-operators_06f22ef3-dbf4-44cd-bd3c-099d4b23c440_0(83e3a3e8e4437517628dd0ca7a934dfefa77e2ea385edd2ca82154c2bd346e5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" podUID="06f22ef3-dbf4-44cd-bd3c-099d4b23c440"
Dec 05 16:08:54 crc kubenswrapper[4778]: I1205 16:08:54.937696 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkbtn"
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.107937 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7bnc\" (UniqueName: \"kubernetes.io/projected/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-kube-api-access-d7bnc\") pod \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\" (UID: \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\") "
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.108025 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-catalog-content\") pod \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\" (UID: \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\") "
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.108080 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-utilities\") pod \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\" (UID: \"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2\") "
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.108923 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-utilities" (OuterVolumeSpecName: "utilities") pod "d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" (UID: "d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.114950 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-kube-api-access-d7bnc" (OuterVolumeSpecName: "kube-api-access-d7bnc") pod "d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" (UID: "d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2"). InnerVolumeSpecName "kube-api-access-d7bnc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.209350 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7bnc\" (UniqueName: \"kubernetes.io/projected/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-kube-api-access-d7bnc\") on node \"crc\" DevicePath \"\""
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.209390 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.221535 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" (UID: "d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.309972 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.644416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkbtn" event={"ID":"d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2","Type":"ContainerDied","Data":"5604f91b7eacfece64261900466ff7d9452690074630bdaa930285ec55124a07"}
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.644435 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkbtn"
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.644511 4778 scope.go:117] "RemoveContainer" containerID="bbd92b6847924366eea198b06edd162c280e21aae84bdd4c52b53a29a157a359"
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.663341 4778 scope.go:117] "RemoveContainer" containerID="200d1621eb3a565fb18bb36f8914db9df08d32772ef4d49b9fab09a21e87e976"
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.664995 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wkbtn"]
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.677062 4778 scope.go:117] "RemoveContainer" containerID="fb596eac147585056d4914258be871aefa8ee2d72213144ea01d7402b4fdf1b0"
Dec 05 16:08:55 crc kubenswrapper[4778]: I1205 16:08:55.683376 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wkbtn"]
Dec 05 16:08:57 crc kubenswrapper[4778]: I1205 16:08:57.255711 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" path="/var/lib/kubelet/pods/d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2/volumes"
Dec 05 16:09:06 crc kubenswrapper[4778]: I1205 16:09:06.248647 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v"
Dec 05 16:09:06 crc kubenswrapper[4778]: I1205 16:09:06.248646 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf"
Dec 05 16:09:06 crc kubenswrapper[4778]: I1205 16:09:06.249469 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v"
Dec 05 16:09:06 crc kubenswrapper[4778]: I1205 16:09:06.249736 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf"
Dec 05 16:09:06 crc kubenswrapper[4778]: I1205 16:09:06.516831 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v"]
Dec 05 16:09:06 crc kubenswrapper[4778]: I1205 16:09:06.574852 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf"]
Dec 05 16:09:06 crc kubenswrapper[4778]: W1205 16:09:06.598492 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06f22ef3_dbf4_44cd_bd3c_099d4b23c440.slice/crio-2733db60ca8d15ea7f9026faf943743d0e566394349800b8981f10d18df310ac WatchSource:0}: Error finding container 2733db60ca8d15ea7f9026faf943743d0e566394349800b8981f10d18df310ac: Status 404 returned error can't find the container with id 2733db60ca8d15ea7f9026faf943743d0e566394349800b8981f10d18df310ac
Dec 05 16:09:06 crc kubenswrapper[4778]: I1205 16:09:06.704175 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" event={"ID":"22a6d7c0-787b-4db2-b559-a15f91626619","Type":"ContainerStarted","Data":"9e3d1a5942a6f7411a2f616afb47d762edabd445f0cb87b1d220b2a118e8cb8b"}
Dec 05 16:09:06 crc kubenswrapper[4778]: I1205 16:09:06.705559 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" event={"ID":"06f22ef3-dbf4-44cd-bd3c-099d4b23c440","Type":"ContainerStarted","Data":"2733db60ca8d15ea7f9026faf943743d0e566394349800b8981f10d18df310ac"}
Dec 05 16:09:07 crc kubenswrapper[4778]: I1205 16:09:07.249308 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p"
Dec 05 16:09:07 crc kubenswrapper[4778]: I1205 16:09:07.249395 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4"
Dec 05 16:09:07 crc kubenswrapper[4778]: I1205 16:09:07.250051 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p"
Dec 05 16:09:07 crc kubenswrapper[4778]: I1205 16:09:07.250131 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4"
Dec 05 16:09:07 crc kubenswrapper[4778]: I1205 16:09:07.501188 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p"]
Dec 05 16:09:07 crc kubenswrapper[4778]: I1205 16:09:07.550950 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-5xcd4"]
Dec 05 16:09:07 crc kubenswrapper[4778]: I1205 16:09:07.709581 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p" event={"ID":"7d00eb5e-6107-4c91-b9ed-540833e16404","Type":"ContainerStarted","Data":"492a0ebe5d96954f595a14b852627fab22109cb8fb5bc611d3cec3b11cd6344c"}
Dec 05 16:09:07 crc kubenswrapper[4778]: I1205 16:09:07.710667 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" event={"ID":"8e816989-62eb-47ce-a33b-2f09f1d2b3c6","Type":"ContainerStarted","Data":"39f6f1a84a09039502fad78c03eefa368bf17c42df60385ecbea0f785f73ce34"}
Dec 05 16:09:10 crc kubenswrapper[4778]: I1205 16:09:10.252207 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nvd8k"
Dec 05 16:09:10 crc kubenswrapper[4778]: I1205 16:09:10.252635 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nvd8k"
Dec 05 16:09:14 crc kubenswrapper[4778]: I1205 16:09:14.834532 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bnrp8"
Dec 05 16:09:24 crc kubenswrapper[4778]: E1205 16:09:24.307345 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb"
Dec 05 16:09:24 crc kubenswrapper[4778]: E1205 16:09:24.308219 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) --openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:e718854a7d6ca8accf0fa72db0eb902e46c44d747ad51dc3f06bba0cefaa3c01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:d972f4faa5e9c121402d23ed85002f26af48ec36b1b71a7489d677b3913d08b4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:91531137fc1dcd740e277e0f65e120a0176a16f788c14c27925b61aa0b792ade,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:897e1bfad1187062725b54d87107bd0155972257a50d8335dd29e1999b828a4f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:95fe5b5746ca8c07ac9217ce2d8ac8e6afad17af210f9d8e0074df1310b209a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:e9d9a89e4d8126a62b1852055482258ee528cac6398dd5d43ebad75ace0f33c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:ec684a0645ceb917b019af7ddba68c3533416e356ab0d0320a30e75ca7ebb31b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:3b9693fcde9b3a9494fb04735b1f7cfd0426f10be820fdc3f024175c0d3df1c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:580606f194180accc8abba099e17a26dca7522ec6d233fa2fdd40312771703e3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:e03777be39e71701935059cd877603874a13ac94daa73219d4e5e545599d78a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:aa47256193cfd2877853878e1ae97d2ab8b8e5deae62b387cbfad02b284d379c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:c595ff56b2cb85514bf4784db6ddb82e4e657e3e708a7fb695fc4997379a94d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:45a4ec2a519bcec99e886aa91596d5356a2414a2bd103baaef9fa7838c672eb2,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sfs8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-d8bb48f5d-5xcd4_openshift-operators(8e816989-62eb-47ce-a33b-2f09f1d2b3c6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 05 16:09:24 crc kubenswrapper[4778]: E1205 16:09:24.309592 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" podUID="8e816989-62eb-47ce-a33b-2f09f1d2b3c6"
Dec 05 16:09:24 crc kubenswrapper[4778]: I1205 16:09:24.609097 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-nvd8k"]
Dec 05 16:09:24 crc kubenswrapper[4778]: W1205 16:09:24.616794 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda29284c_59ce_4aed_a280_0b9b550c2c96.slice/crio-32ae53a5512a3f2a79e9f073c712f3d7f1812705687f46294e899b8dc3e65334 WatchSource:0}: Error finding container 32ae53a5512a3f2a79e9f073c712f3d7f1812705687f46294e899b8dc3e65334: Status 404 returned error can't find the container with id 32ae53a5512a3f2a79e9f073c712f3d7f1812705687f46294e899b8dc3e65334
Dec 05 16:09:24 crc kubenswrapper[4778]: I1205 16:09:24.824980 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-nvd8k" event={"ID":"da29284c-59ce-4aed-a280-0b9b550c2c96","Type":"ContainerStarted","Data":"32ae53a5512a3f2a79e9f073c712f3d7f1812705687f46294e899b8dc3e65334"}
Dec 05 16:09:24 crc kubenswrapper[4778]: I1205 16:09:24.827021 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p" event={"ID":"7d00eb5e-6107-4c91-b9ed-540833e16404","Type":"ContainerStarted","Data":"6626ac85ffa6db1d8971b72bb844534d30750517c7ee2dcd33442bd8a63cc32a"}
Dec 05 16:09:24 crc kubenswrapper[4778]: I1205 16:09:24.828631 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" event={"ID":"22a6d7c0-787b-4db2-b559-a15f91626619","Type":"ContainerStarted","Data":"c11e390638ba8ce37ca9fa272c2f5f301709ff048d55952e27c13986dcf3ac0c"}
Dec 05 16:09:24 crc kubenswrapper[4778]: I1205 16:09:24.830301 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" event={"ID":"06f22ef3-dbf4-44cd-bd3c-099d4b23c440","Type":"ContainerStarted","Data":"ab64831cb11c9a80a950ca66b6f626e29bdbde2347114c6603a1ccee6cbec49b"}
Dec 05 16:09:24 crc kubenswrapper[4778]: E1205 16:09:24.831415 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb\\\"\"" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" podUID="8e816989-62eb-47ce-a33b-2f09f1d2b3c6"
Dec 05 16:09:24 crc kubenswrapper[4778]: I1205 16:09:24.852841 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-phf8p" podStartSLOduration=22.02276591 podStartE2EDuration="38.852827097s" podCreationTimestamp="2025-12-05 16:08:46 +0000 UTC" firstStartedPulling="2025-12-05 16:09:07.510963253 +0000 UTC m=+834.614759633" lastFinishedPulling="2025-12-05 16:09:24.34102443 +0000 UTC m=+851.444820820" observedRunningTime="2025-12-05 16:09:24.850959879 +0000 UTC m=+851.954756269" watchObservedRunningTime="2025-12-05 16:09:24.852827097 +0000 UTC m=+851.956623467"
Dec 05 16:09:24 crc kubenswrapper[4778]: I1205 16:09:24.872444 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf" podStartSLOduration=21.131370319 podStartE2EDuration="38.872425023s" podCreationTimestamp="2025-12-05 16:08:46 +0000 UTC" firstStartedPulling="2025-12-05 16:09:06.599942576 +0000 UTC m=+833.703738956" lastFinishedPulling="2025-12-05 16:09:24.34099728 +0000 UTC m=+851.444793660" observedRunningTime="2025-12-05 16:09:24.868854129 +0000 UTC m=+851.972650499" watchObservedRunningTime="2025-12-05 16:09:24.872425023 +0000 UTC m=+851.976221413"
Dec 05 16:09:24 crc kubenswrapper[4778]: I1205 16:09:24.919616 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v" podStartSLOduration=21.087219844 podStartE2EDuration="38.919594375s" podCreationTimestamp="2025-12-05 16:08:46 +0000 UTC" firstStartedPulling="2025-12-05 16:09:06.52843498 +0000 UTC m=+833.632231360" lastFinishedPulling="2025-12-05 16:09:24.360809511 +0000 UTC m=+851.464605891" observedRunningTime="2025-12-05 16:09:24.914428469 +0000 UTC m=+852.018224879" watchObservedRunningTime="2025-12-05 16:09:24.919594375 +0000 UTC m=+852.023390755"
Dec 05 16:09:26 crc kubenswrapper[4778]: I1205 16:09:26.846473 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-nvd8k" event={"ID":"da29284c-59ce-4aed-a280-0b9b550c2c96","Type":"ContainerStarted","Data":"bf1b1dce42f882700462b660712c9a161cabc47ed8b19d8c132409ecb27d1f1c"}
Dec 05 16:09:26 crc kubenswrapper[4778]: I1205 16:09:26.848143 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-nvd8k"
Dec 05 16:09:26 crc kubenswrapper[4778]: I1205 16:09:26.873208 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-nvd8k" podStartSLOduration=38.120915257 podStartE2EDuration="39.87318139s" podCreationTimestamp="2025-12-05 16:08:47 +0000 UTC" firstStartedPulling="2025-12-05 16:09:24.618592339 +0000 UTC m=+851.722388729" lastFinishedPulling="2025-12-05 16:09:26.370858482 +0000 UTC m=+853.474654862" observedRunningTime="2025-12-05 16:09:26.865309093 +0000 UTC m=+853.969105503" watchObservedRunningTime="2025-12-05 16:09:26.87318139 +0000 UTC m=+853.976977810"
Dec 05 16:09:37 crc kubenswrapper[4778]: I1205 16:09:37.382740 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-nvd8k"
Dec 05 16:09:37 crc kubenswrapper[4778]: I1205 16:09:37.921670 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" event={"ID":"8e816989-62eb-47ce-a33b-2f09f1d2b3c6","Type":"ContainerStarted","Data":"30007aad7a88026645d4c2a0c8b6296b785b6b285b9c5901f7c39c44692c8b79"}
Dec 05 16:09:37 crc kubenswrapper[4778]: I1205 16:09:37.922522 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4"
Dec 05 16:09:37 crc kubenswrapper[4778]: I1205 16:09:37.925154 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4"
Dec 05 16:09:37 crc kubenswrapper[4778]: I1205 16:09:37.940806 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-5xcd4" podStartSLOduration=22.448731905 podStartE2EDuration="51.940792634s" podCreationTimestamp="2025-12-05 16:08:46 +0000 UTC" firstStartedPulling="2025-12-05 16:09:07.566924015 +0000 UTC m=+834.670720395" lastFinishedPulling="2025-12-05 16:09:37.058984744 +0000 UTC m=+864.162781124" observedRunningTime="2025-12-05 16:09:37.940587749 +0000 UTC m=+865.044384129" watchObservedRunningTime="2025-12-05 16:09:37.940792634 +0000 UTC m=+865.044589014"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.409336 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"]
Dec 05 16:09:43 crc kubenswrapper[4778]: E1205 16:09:43.409878 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" containerName="extract-utilities"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.409893 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" containerName="extract-utilities"
Dec 05 16:09:43 crc kubenswrapper[4778]: E1205 16:09:43.409905 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" containerName="registry-server"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.409912 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" containerName="registry-server"
Dec 05 16:09:43 crc kubenswrapper[4778]: E1205 16:09:43.409934 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" containerName="extract-content"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.409942 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" containerName="extract-content"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.410066 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b2e4be-cc3d-4f22-87ab-b6a7718e92e2" containerName="registry-server"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.410985 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.415132 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.417518 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"]
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.592270 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx\" (UID: \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.592423 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx\" (UID: \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.592448 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8xnz\" (UniqueName: \"kubernetes.io/projected/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-kube-api-access-v8xnz\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx\" (UID: \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.693168 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx\" (UID: \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.693219 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xnz\" (UniqueName: \"kubernetes.io/projected/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-kube-api-access-v8xnz\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx\" (UID: \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.693275 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx\" (UID: \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.693699 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx\" (UID: \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.693751 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx\" (UID: \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"
Dec 05 16:09:43 crc kubenswrapper[4778]: I1205 16:09:43.728579 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8xnz\" (UniqueName: \"kubernetes.io/projected/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-kube-api-access-v8xnz\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx\" (UID: \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"
Dec 05 16:09:44 crc kubenswrapper[4778]: I1205 16:09:44.026354 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"
Dec 05 16:09:44 crc kubenswrapper[4778]: I1205 16:09:44.446012 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx"]
Dec 05 16:09:44 crc kubenswrapper[4778]: I1205 16:09:44.969592 4778 generic.go:334] "Generic (PLEG): container finished" podID="27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" containerID="e69231dd71a2a66171e123c54aab3721190851aaee160022d1bb0d662afd7a4f" exitCode=0
Dec 05 16:09:44 crc kubenswrapper[4778]: I1205 16:09:44.969855 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx" event={"ID":"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8","Type":"ContainerDied","Data":"e69231dd71a2a66171e123c54aab3721190851aaee160022d1bb0d662afd7a4f"}
Dec 05 16:09:44 crc kubenswrapper[4778]: I1205 16:09:44.969932 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx" event={"ID":"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8","Type":"ContainerStarted","Data":"32bb20840e36836b9ed0c27841046b43bb28c6e1f53a5db8cc39d5686f0b27f0"}
Dec 05 16:09:46 crc kubenswrapper[4778]: I1205 16:09:46.985022 4778 generic.go:334] "Generic (PLEG): container finished" podID="27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" containerID="42841ccd582207460e9787c6c9e75ec0fffb5d8c86865665b3ceace3ab553b50" exitCode=0
Dec 05 16:09:46 crc kubenswrapper[4778]: I1205 16:09:46.985066 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx" event={"ID":"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8","Type":"ContainerDied","Data":"42841ccd582207460e9787c6c9e75ec0fffb5d8c86865665b3ceace3ab553b50"}
Dec 05 16:09:47 crc kubenswrapper[4778]: I1205 16:09:47.991955 4778 generic.go:334] "Generic (PLEG): container finished" podID="27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" containerID="8667a4caa69de62db0c9bee4bde75b083570e5a266a0eca9edff8503c941749a" exitCode=0
Dec 05 16:09:47 crc kubenswrapper[4778]: I1205 16:09:47.992007 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx" event={"ID":"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8","Type":"ContainerDied","Data":"8667a4caa69de62db0c9bee4bde75b083570e5a266a0eca9edff8503c941749a"}
Dec 05 16:09:49 crc kubenswrapper[4778]: I1205 16:09:49.221173 4778 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx" Dec 05 16:09:49 crc kubenswrapper[4778]: I1205 16:09:49.366401 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-util\") pod \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\" (UID: \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\") " Dec 05 16:09:49 crc kubenswrapper[4778]: I1205 16:09:49.366574 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8xnz\" (UniqueName: \"kubernetes.io/projected/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-kube-api-access-v8xnz\") pod \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\" (UID: \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\") " Dec 05 16:09:49 crc kubenswrapper[4778]: I1205 16:09:49.366604 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-bundle\") pod \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\" (UID: \"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8\") " Dec 05 16:09:49 crc kubenswrapper[4778]: I1205 16:09:49.367342 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-bundle" (OuterVolumeSpecName: "bundle") pod "27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" (UID: "27a8b0f6-f04b-4ce5-a429-019e09e1c6b8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:09:49 crc kubenswrapper[4778]: I1205 16:09:49.385561 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-kube-api-access-v8xnz" (OuterVolumeSpecName: "kube-api-access-v8xnz") pod "27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" (UID: "27a8b0f6-f04b-4ce5-a429-019e09e1c6b8"). InnerVolumeSpecName "kube-api-access-v8xnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:09:49 crc kubenswrapper[4778]: I1205 16:09:49.426872 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-util" (OuterVolumeSpecName: "util") pod "27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" (UID: "27a8b0f6-f04b-4ce5-a429-019e09e1c6b8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:09:49 crc kubenswrapper[4778]: I1205 16:09:49.467575 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-util\") on node \"crc\" DevicePath \"\"" Dec 05 16:09:49 crc kubenswrapper[4778]: I1205 16:09:49.467607 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8xnz\" (UniqueName: \"kubernetes.io/projected/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-kube-api-access-v8xnz\") on node \"crc\" DevicePath \"\"" Dec 05 16:09:49 crc kubenswrapper[4778]: I1205 16:09:49.467618 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27a8b0f6-f04b-4ce5-a429-019e09e1c6b8-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:09:50 crc kubenswrapper[4778]: I1205 16:09:50.031417 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx" event={"ID":"27a8b0f6-f04b-4ce5-a429-019e09e1c6b8","Type":"ContainerDied","Data":"32bb20840e36836b9ed0c27841046b43bb28c6e1f53a5db8cc39d5686f0b27f0"} Dec 05 16:09:50 crc kubenswrapper[4778]: I1205 16:09:50.031507 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx" Dec 05 16:09:50 crc kubenswrapper[4778]: I1205 16:09:50.031547 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32bb20840e36836b9ed0c27841046b43bb28c6e1f53a5db8cc39d5686f0b27f0" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.156965 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xlngh"] Dec 05 16:09:52 crc kubenswrapper[4778]: E1205 16:09:52.157318 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" containerName="util" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.157343 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" containerName="util" Dec 05 16:09:52 crc kubenswrapper[4778]: E1205 16:09:52.157420 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" containerName="pull" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.157440 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" containerName="pull" Dec 05 16:09:52 crc kubenswrapper[4778]: E1205 16:09:52.157468 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" containerName="extract" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.157486 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" containerName="extract" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.157670 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a8b0f6-f04b-4ce5-a429-019e09e1c6b8" containerName="extract" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.159097 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.164319 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xlngh"] Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.301961 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snwv7\" (UniqueName: \"kubernetes.io/projected/f4d62fd2-d151-423c-9277-f5e38bc522b6-kube-api-access-snwv7\") pod \"community-operators-xlngh\" (UID: \"f4d62fd2-d151-423c-9277-f5e38bc522b6\") " pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.302319 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4d62fd2-d151-423c-9277-f5e38bc522b6-utilities\") pod \"community-operators-xlngh\" (UID: \"f4d62fd2-d151-423c-9277-f5e38bc522b6\") " pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.302356 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4d62fd2-d151-423c-9277-f5e38bc522b6-catalog-content\") pod \"community-operators-xlngh\" (UID: \"f4d62fd2-d151-423c-9277-f5e38bc522b6\") " pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.403734 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snwv7\" (UniqueName: \"kubernetes.io/projected/f4d62fd2-d151-423c-9277-f5e38bc522b6-kube-api-access-snwv7\") pod \"community-operators-xlngh\" (UID: \"f4d62fd2-d151-423c-9277-f5e38bc522b6\") " pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.403800 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4d62fd2-d151-423c-9277-f5e38bc522b6-utilities\") pod \"community-operators-xlngh\" (UID: \"f4d62fd2-d151-423c-9277-f5e38bc522b6\") " pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.403839 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4d62fd2-d151-423c-9277-f5e38bc522b6-catalog-content\") pod \"community-operators-xlngh\" (UID: \"f4d62fd2-d151-423c-9277-f5e38bc522b6\") " pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.404330 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4d62fd2-d151-423c-9277-f5e38bc522b6-catalog-content\") pod \"community-operators-xlngh\" (UID: \"f4d62fd2-d151-423c-9277-f5e38bc522b6\") " pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.404497 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4d62fd2-d151-423c-9277-f5e38bc522b6-utilities\") pod \"community-operators-xlngh\" (UID: \"f4d62fd2-d151-423c-9277-f5e38bc522b6\") " pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.438496 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-snwv7\" (UniqueName: \"kubernetes.io/projected/f4d62fd2-d151-423c-9277-f5e38bc522b6-kube-api-access-snwv7\") pod \"community-operators-xlngh\" (UID: \"f4d62fd2-d151-423c-9277-f5e38bc522b6\") " pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.521525 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:09:52 crc kubenswrapper[4778]: I1205 16:09:52.763303 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xlngh"] Dec 05 16:09:53 crc kubenswrapper[4778]: I1205 16:09:53.054457 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlngh" event={"ID":"f4d62fd2-d151-423c-9277-f5e38bc522b6","Type":"ContainerStarted","Data":"320eb8df1375deead5097ef57ae0d9657a7b3c954419e2b5ad39a1b88ff679f7"} Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.060311 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4d62fd2-d151-423c-9277-f5e38bc522b6" containerID="71a20655a008ce72b3700a1539ec6f317fd835c637c44a60081e26e6d3164414" exitCode=0 Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.060351 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlngh" event={"ID":"f4d62fd2-d151-423c-9277-f5e38bc522b6","Type":"ContainerDied","Data":"71a20655a008ce72b3700a1539ec6f317fd835c637c44a60081e26e6d3164414"} Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.466843 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-qjjrk"] Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.468168 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qjjrk" Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.470234 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.470483 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-4ths8" Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.472549 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.486140 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-qjjrk"] Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.527286 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kbk7\" (UniqueName: \"kubernetes.io/projected/530e55ef-5024-4ab8-9072-709ca49ddc13-kube-api-access-2kbk7\") pod \"nmstate-operator-5b5b58f5c8-qjjrk\" (UID: \"530e55ef-5024-4ab8-9072-709ca49ddc13\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qjjrk" Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.627994 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kbk7\" (UniqueName: \"kubernetes.io/projected/530e55ef-5024-4ab8-9072-709ca49ddc13-kube-api-access-2kbk7\") pod \"nmstate-operator-5b5b58f5c8-qjjrk\" (UID: \"530e55ef-5024-4ab8-9072-709ca49ddc13\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qjjrk" Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.647186 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kbk7\" (UniqueName: \"kubernetes.io/projected/530e55ef-5024-4ab8-9072-709ca49ddc13-kube-api-access-2kbk7\") pod \"nmstate-operator-5b5b58f5c8-qjjrk\" (UID: \"530e55ef-5024-4ab8-9072-709ca49ddc13\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qjjrk" Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.785744 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qjjrk" Dec 05 16:09:54 crc kubenswrapper[4778]: I1205 16:09:54.974465 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-qjjrk"] Dec 05 16:09:55 crc kubenswrapper[4778]: I1205 16:09:55.066961 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qjjrk" event={"ID":"530e55ef-5024-4ab8-9072-709ca49ddc13","Type":"ContainerStarted","Data":"95c984272bd959b8b1cafb064a4e64adf690d7233ec009247861e16aea8677e1"} Dec 05 16:09:56 crc kubenswrapper[4778]: I1205 16:09:56.074634 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4d62fd2-d151-423c-9277-f5e38bc522b6" containerID="93c2aa9c5f863802e25403f0ef442b6e74538f673ac9cd8d74a87929b6b20be4" exitCode=0 Dec 05 16:09:56 crc kubenswrapper[4778]: I1205 16:09:56.074736 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlngh" event={"ID":"f4d62fd2-d151-423c-9277-f5e38bc522b6","Type":"ContainerDied","Data":"93c2aa9c5f863802e25403f0ef442b6e74538f673ac9cd8d74a87929b6b20be4"} Dec 05 16:09:57 crc kubenswrapper[4778]: I1205 16:09:57.085174 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlngh" event={"ID":"f4d62fd2-d151-423c-9277-f5e38bc522b6","Type":"ContainerStarted","Data":"6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e"} Dec 05 16:09:57 crc kubenswrapper[4778]: I1205 16:09:57.087450 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qjjrk" event={"ID":"530e55ef-5024-4ab8-9072-709ca49ddc13","Type":"ContainerStarted","Data":"c16a870d1f6d8dbe15dba3285d6e76970669566b17733b15015eaaaa26e9e304"} Dec 05 16:09:57 crc kubenswrapper[4778]: I1205 16:09:57.108527 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xlngh" podStartSLOduration=2.623112963 podStartE2EDuration="5.108501841s" podCreationTimestamp="2025-12-05 16:09:52 +0000 UTC" firstStartedPulling="2025-12-05 16:09:54.062230043 +0000 UTC m=+881.166026423" lastFinishedPulling="2025-12-05 16:09:56.547618921 +0000 UTC m=+883.651415301" observedRunningTime="2025-12-05 16:09:57.103212092 +0000 UTC m=+884.207008502" watchObservedRunningTime="2025-12-05 16:09:57.108501841 +0000 UTC m=+884.212298251" Dec 05 16:09:57 crc kubenswrapper[4778]: I1205 16:09:57.131023 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qjjrk" podStartSLOduration=1.567496872 podStartE2EDuration="3.131000254s" podCreationTimestamp="2025-12-05 16:09:54 +0000 UTC" firstStartedPulling="2025-12-05 16:09:54.985625749 +0000 UTC m=+882.089422129" lastFinishedPulling="2025-12-05 16:09:56.549129131 +0000 UTC m=+883.652925511" observedRunningTime="2025-12-05 16:09:57.127691816 +0000 UTC m=+884.231488206" watchObservedRunningTime="2025-12-05 16:09:57.131000254 +0000 UTC m=+884.234796644" Dec 05 16:10:02 crc kubenswrapper[4778]: I1205 16:10:02.522114 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:10:02 crc kubenswrapper[4778]: I1205 16:10:02.523678 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:10:02 crc kubenswrapper[4778]: I1205 16:10:02.593088 4778 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:10:03 crc kubenswrapper[4778]: I1205 16:10:03.208676 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:10:03 crc kubenswrapper[4778]: I1205 16:10:03.415419 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:10:03 crc kubenswrapper[4778]: I1205 16:10:03.415506 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:10:03 crc kubenswrapper[4778]: I1205 16:10:03.526413 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xlngh"] Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.662828 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-4zkxj"] Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.663920 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zkxj" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.670038 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd"] Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.671307 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.672739 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2rv4f" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.672792 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.677654 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-4zkxj"] Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.709257 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd"] Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.734298 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mms85"] Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.735012 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.763839 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsrdn\" (UniqueName: \"kubernetes.io/projected/63e85179-39a5-418e-abe3-91a9a9a276e3-kube-api-access-wsrdn\") pod \"nmstate-metrics-7f946cbc9-4zkxj\" (UID: \"63e85179-39a5-418e-abe3-91a9a9a276e3\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zkxj" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.802111 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m"] Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.802752 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.804379 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.804759 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-zlkr2" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.805763 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.822297 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m"] Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.864453 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/585193cc-6d97-46b0-95be-7ae7fffb2d11-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-64vbd\" (UID: \"585193cc-6d97-46b0-95be-7ae7fffb2d11\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.864504 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhtkz\" (UniqueName: \"kubernetes.io/projected/63520720-3a3d-44da-b13c-7406c45a6d50-kube-api-access-rhtkz\") pod \"nmstate-handler-mms85\" (UID: \"63520720-3a3d-44da-b13c-7406c45a6d50\") " pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.864542 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/63520720-3a3d-44da-b13c-7406c45a6d50-ovs-socket\") pod \"nmstate-handler-mms85\" (UID: \"63520720-3a3d-44da-b13c-7406c45a6d50\") " pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.864566 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsrdn\" (UniqueName: \"kubernetes.io/projected/63e85179-39a5-418e-abe3-91a9a9a276e3-kube-api-access-wsrdn\") pod \"nmstate-metrics-7f946cbc9-4zkxj\" (UID: \"63e85179-39a5-418e-abe3-91a9a9a276e3\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zkxj" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.864592 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/63520720-3a3d-44da-b13c-7406c45a6d50-nmstate-lock\") pod \"nmstate-handler-mms85\" (UID: 
\"63520720-3a3d-44da-b13c-7406c45a6d50\") " pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.864613 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l8gx\" (UniqueName: \"kubernetes.io/projected/585193cc-6d97-46b0-95be-7ae7fffb2d11-kube-api-access-8l8gx\") pod \"nmstate-webhook-5f6d4c5ccb-64vbd\" (UID: \"585193cc-6d97-46b0-95be-7ae7fffb2d11\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.864637 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/63520720-3a3d-44da-b13c-7406c45a6d50-dbus-socket\") pod \"nmstate-handler-mms85\" (UID: \"63520720-3a3d-44da-b13c-7406c45a6d50\") " pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.882589 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsrdn\" (UniqueName: \"kubernetes.io/projected/63e85179-39a5-418e-abe3-91a9a9a276e3-kube-api-access-wsrdn\") pod \"nmstate-metrics-7f946cbc9-4zkxj\" (UID: \"63e85179-39a5-418e-abe3-91a9a9a276e3\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zkxj" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.966232 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/63520720-3a3d-44da-b13c-7406c45a6d50-ovs-socket\") pod \"nmstate-handler-mms85\" (UID: \"63520720-3a3d-44da-b13c-7406c45a6d50\") " pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.966288 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4cbfbce7-cb93-4e17-8e3a-688a04322274-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-w494m\" (UID: \"4cbfbce7-cb93-4e17-8e3a-688a04322274\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.966327 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/63520720-3a3d-44da-b13c-7406c45a6d50-nmstate-lock\") pod \"nmstate-handler-mms85\" (UID: \"63520720-3a3d-44da-b13c-7406c45a6d50\") " pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.966357 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qvtz\" (UniqueName: \"kubernetes.io/projected/4cbfbce7-cb93-4e17-8e3a-688a04322274-kube-api-access-4qvtz\") pod \"nmstate-console-plugin-7fbb5f6569-w494m\" (UID: \"4cbfbce7-cb93-4e17-8e3a-688a04322274\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.966400 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l8gx\" (UniqueName: \"kubernetes.io/projected/585193cc-6d97-46b0-95be-7ae7fffb2d11-kube-api-access-8l8gx\") pod \"nmstate-webhook-5f6d4c5ccb-64vbd\" (UID: \"585193cc-6d97-46b0-95be-7ae7fffb2d11\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.966437 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" 
(UniqueName: \"kubernetes.io/host-path/63520720-3a3d-44da-b13c-7406c45a6d50-dbus-socket\") pod \"nmstate-handler-mms85\" (UID: \"63520720-3a3d-44da-b13c-7406c45a6d50\") " pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.966476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cbfbce7-cb93-4e17-8e3a-688a04322274-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-w494m\" (UID: \"4cbfbce7-cb93-4e17-8e3a-688a04322274\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.966503 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/585193cc-6d97-46b0-95be-7ae7fffb2d11-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-64vbd\" (UID: \"585193cc-6d97-46b0-95be-7ae7fffb2d11\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.966536 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhtkz\" (UniqueName: \"kubernetes.io/projected/63520720-3a3d-44da-b13c-7406c45a6d50-kube-api-access-rhtkz\") pod \"nmstate-handler-mms85\" (UID: \"63520720-3a3d-44da-b13c-7406c45a6d50\") " pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.966856 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/63520720-3a3d-44da-b13c-7406c45a6d50-ovs-socket\") pod \"nmstate-handler-mms85\" (UID: \"63520720-3a3d-44da-b13c-7406c45a6d50\") " pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.966911 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/63520720-3a3d-44da-b13c-7406c45a6d50-nmstate-lock\") pod \"nmstate-handler-mms85\" (UID: \"63520720-3a3d-44da-b13c-7406c45a6d50\") " pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.968239 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/63520720-3a3d-44da-b13c-7406c45a6d50-dbus-socket\") pod \"nmstate-handler-mms85\" (UID: \"63520720-3a3d-44da-b13c-7406c45a6d50\") " pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.972985 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/585193cc-6d97-46b0-95be-7ae7fffb2d11-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-64vbd\" (UID: \"585193cc-6d97-46b0-95be-7ae7fffb2d11\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.989225 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zkxj" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.995042 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-78776b8fbf-hfsq2"] Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.995973 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:04 crc kubenswrapper[4778]: I1205 16:10:04.997223 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l8gx\" (UniqueName: \"kubernetes.io/projected/585193cc-6d97-46b0-95be-7ae7fffb2d11-kube-api-access-8l8gx\") pod \"nmstate-webhook-5f6d4c5ccb-64vbd\" (UID: \"585193cc-6d97-46b0-95be-7ae7fffb2d11\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.002339 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.011875 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhtkz\" (UniqueName: \"kubernetes.io/projected/63520720-3a3d-44da-b13c-7406c45a6d50-kube-api-access-rhtkz\") pod \"nmstate-handler-mms85\" (UID: \"63520720-3a3d-44da-b13c-7406c45a6d50\") " pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.021605 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78776b8fbf-hfsq2"] Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.050706 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.067725 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cbfbce7-cb93-4e17-8e3a-688a04322274-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-w494m\" (UID: \"4cbfbce7-cb93-4e17-8e3a-688a04322274\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.067793 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4cbfbce7-cb93-4e17-8e3a-688a04322274-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-w494m\" (UID: \"4cbfbce7-cb93-4e17-8e3a-688a04322274\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.067824 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qvtz\" (UniqueName: \"kubernetes.io/projected/4cbfbce7-cb93-4e17-8e3a-688a04322274-kube-api-access-4qvtz\") pod \"nmstate-console-plugin-7fbb5f6569-w494m\" (UID: \"4cbfbce7-cb93-4e17-8e3a-688a04322274\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.069165 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4cbfbce7-cb93-4e17-8e3a-688a04322274-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-w494m\" (UID: \"4cbfbce7-cb93-4e17-8e3a-688a04322274\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.071553 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cbfbce7-cb93-4e17-8e3a-688a04322274-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-w494m\" (UID: \"4cbfbce7-cb93-4e17-8e3a-688a04322274\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" Dec 05 16:10:05 crc kubenswrapper[4778]: 
I1205 16:10:05.084746 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qvtz\" (UniqueName: \"kubernetes.io/projected/4cbfbce7-cb93-4e17-8e3a-688a04322274-kube-api-access-4qvtz\") pod \"nmstate-console-plugin-7fbb5f6569-w494m\" (UID: \"4cbfbce7-cb93-4e17-8e3a-688a04322274\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.115396 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.149730 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mms85" event={"ID":"63520720-3a3d-44da-b13c-7406c45a6d50","Type":"ContainerStarted","Data":"d3aa6819b36e7191dde2c9d3b4caed28bb3c56868b3c330b6b978c1c87bb8614"} Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.149834 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xlngh" podUID="f4d62fd2-d151-423c-9277-f5e38bc522b6" containerName="registry-server" containerID="cri-o://6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e" gracePeriod=2 Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.172124 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-console-config\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.172186 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khhr7\" (UniqueName: \"kubernetes.io/projected/1f5e616a-22c3-400f-829a-21846578a9d0-kube-api-access-khhr7\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.172214 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-oauth-serving-cert\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.172240 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-service-ca\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.172329 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f5e616a-22c3-400f-829a-21846578a9d0-console-serving-cert\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.172349 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/1f5e616a-22c3-400f-829a-21846578a9d0-console-oauth-config\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.172397 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-trusted-ca-bundle\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.220632 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd"] Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.267167 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-4zkxj"] Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.273156 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-oauth-serving-cert\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.273210 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-service-ca\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.273269 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f5e616a-22c3-400f-829a-21846578a9d0-console-serving-cert\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.273303 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f5e616a-22c3-400f-829a-21846578a9d0-console-oauth-config\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.273329 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-trusted-ca-bundle\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.273393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-console-config\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.273422 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khhr7\" (UniqueName: 
\"kubernetes.io/projected/1f5e616a-22c3-400f-829a-21846578a9d0-kube-api-access-khhr7\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.274471 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-oauth-serving-cert\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.274961 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-service-ca\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.276211 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-trusted-ca-bundle\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.276686 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-console-config\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.278755 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f5e616a-22c3-400f-829a-21846578a9d0-console-serving-cert\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.281044 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f5e616a-22c3-400f-829a-21846578a9d0-console-oauth-config\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.287529 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khhr7\" (UniqueName: \"kubernetes.io/projected/1f5e616a-22c3-400f-829a-21846578a9d0-kube-api-access-khhr7\") pod \"console-78776b8fbf-hfsq2\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.315972 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m"] Dec 05 16:10:05 crc kubenswrapper[4778]: W1205 16:10:05.358297 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63e85179_39a5_418e_abe3_91a9a9a276e3.slice/crio-9d07b323ce34fa75dba6d6ace94c9ae7dbac37153f818e9ffa5aac02d8b5d75a WatchSource:0}: Error finding container 9d07b323ce34fa75dba6d6ace94c9ae7dbac37153f818e9ffa5aac02d8b5d75a: Status 404 returned error can't 
find the container with id 9d07b323ce34fa75dba6d6ace94c9ae7dbac37153f818e9ffa5aac02d8b5d75a Dec 05 16:10:05 crc kubenswrapper[4778]: W1205 16:10:05.359238 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cbfbce7_cb93_4e17_8e3a_688a04322274.slice/crio-066e6a3133c22cb8dceb89bf7a968653f1461de18e46e490c4b7ac9e11fc4db1 WatchSource:0}: Error finding container 066e6a3133c22cb8dceb89bf7a968653f1461de18e46e490c4b7ac9e11fc4db1: Status 404 returned error can't find the container with id 066e6a3133c22cb8dceb89bf7a968653f1461de18e46e490c4b7ac9e11fc4db1 Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.369685 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:05 crc kubenswrapper[4778]: I1205 16:10:05.582761 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78776b8fbf-hfsq2"] Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.058654 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.156269 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" event={"ID":"4cbfbce7-cb93-4e17-8e3a-688a04322274","Type":"ContainerStarted","Data":"066e6a3133c22cb8dceb89bf7a968653f1461de18e46e490c4b7ac9e11fc4db1"} Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.159447 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4d62fd2-d151-423c-9277-f5e38bc522b6" containerID="6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e" exitCode=0 Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.159507 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlngh" event={"ID":"f4d62fd2-d151-423c-9277-f5e38bc522b6","Type":"ContainerDied","Data":"6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e"} Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.159525 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlngh" event={"ID":"f4d62fd2-d151-423c-9277-f5e38bc522b6","Type":"ContainerDied","Data":"320eb8df1375deead5097ef57ae0d9657a7b3c954419e2b5ad39a1b88ff679f7"} Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.159542 4778 scope.go:117] "RemoveContainer" containerID="6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.159635 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xlngh" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.162426 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78776b8fbf-hfsq2" event={"ID":"1f5e616a-22c3-400f-829a-21846578a9d0","Type":"ContainerStarted","Data":"39df04f83ed5475f698c3ffc8cf4840b4f8831615c8b25b6129b11eeadc26769"} Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.162470 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78776b8fbf-hfsq2" event={"ID":"1f5e616a-22c3-400f-829a-21846578a9d0","Type":"ContainerStarted","Data":"38365e34566293d78897bcd2ed14e6b1c4865d36d675cfb5415608f0698e75c4"} Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.164381 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zkxj" event={"ID":"63e85179-39a5-418e-abe3-91a9a9a276e3","Type":"ContainerStarted","Data":"9d07b323ce34fa75dba6d6ace94c9ae7dbac37153f818e9ffa5aac02d8b5d75a"} Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.165795 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" event={"ID":"585193cc-6d97-46b0-95be-7ae7fffb2d11","Type":"ContainerStarted","Data":"9fe9b7cce33158ee439715fb0886e35f7806e86c2f77ed14580b1d19c9bff124"} Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.180742 4778 scope.go:117] "RemoveContainer" containerID="93c2aa9c5f863802e25403f0ef442b6e74538f673ac9cd8d74a87929b6b20be4" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.184978 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78776b8fbf-hfsq2" podStartSLOduration=2.184962793 podStartE2EDuration="2.184962793s" podCreationTimestamp="2025-12-05 16:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:10:06.183069983 +0000 UTC m=+893.286866423" watchObservedRunningTime="2025-12-05 16:10:06.184962793 +0000 UTC m=+893.288759173" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.199160 4778 scope.go:117] "RemoveContainer" containerID="71a20655a008ce72b3700a1539ec6f317fd835c637c44a60081e26e6d3164414" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.201895 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snwv7\" (UniqueName: \"kubernetes.io/projected/f4d62fd2-d151-423c-9277-f5e38bc522b6-kube-api-access-snwv7\") pod \"f4d62fd2-d151-423c-9277-f5e38bc522b6\" (UID: \"f4d62fd2-d151-423c-9277-f5e38bc522b6\") " Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.201961 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4d62fd2-d151-423c-9277-f5e38bc522b6-catalog-content\") pod \"f4d62fd2-d151-423c-9277-f5e38bc522b6\" (UID: \"f4d62fd2-d151-423c-9277-f5e38bc522b6\") " Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.202441 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4d62fd2-d151-423c-9277-f5e38bc522b6-utilities\") pod \"f4d62fd2-d151-423c-9277-f5e38bc522b6\" (UID: \"f4d62fd2-d151-423c-9277-f5e38bc522b6\") " Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.204180 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f4d62fd2-d151-423c-9277-f5e38bc522b6-utilities" (OuterVolumeSpecName: "utilities") pod "f4d62fd2-d151-423c-9277-f5e38bc522b6" (UID: "f4d62fd2-d151-423c-9277-f5e38bc522b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.208195 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4d62fd2-d151-423c-9277-f5e38bc522b6-kube-api-access-snwv7" (OuterVolumeSpecName: "kube-api-access-snwv7") pod "f4d62fd2-d151-423c-9277-f5e38bc522b6" (UID: "f4d62fd2-d151-423c-9277-f5e38bc522b6"). InnerVolumeSpecName "kube-api-access-snwv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.217463 4778 scope.go:117] "RemoveContainer" containerID="6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e" Dec 05 16:10:06 crc kubenswrapper[4778]: E1205 16:10:06.218792 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e\": container with ID starting with 6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e not found: ID does not exist" containerID="6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.218828 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e"} err="failed to get container status \"6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e\": rpc error: code = NotFound desc = could not find container \"6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e\": container with ID starting with 6ad5c1e005fe55141b984f89cf9baa826b1c06183d8f5267999c6a6ad5f01d3e not found: ID does not exist" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.218852 4778 scope.go:117] "RemoveContainer" containerID="93c2aa9c5f863802e25403f0ef442b6e74538f673ac9cd8d74a87929b6b20be4" Dec 05 16:10:06 crc kubenswrapper[4778]: E1205 16:10:06.219193 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c2aa9c5f863802e25403f0ef442b6e74538f673ac9cd8d74a87929b6b20be4\": container with ID starting with 93c2aa9c5f863802e25403f0ef442b6e74538f673ac9cd8d74a87929b6b20be4 not found: ID does not exist" containerID="93c2aa9c5f863802e25403f0ef442b6e74538f673ac9cd8d74a87929b6b20be4" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.219217 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c2aa9c5f863802e25403f0ef442b6e74538f673ac9cd8d74a87929b6b20be4"} err="failed to get container status \"93c2aa9c5f863802e25403f0ef442b6e74538f673ac9cd8d74a87929b6b20be4\": rpc error: code = NotFound desc = could not find container \"93c2aa9c5f863802e25403f0ef442b6e74538f673ac9cd8d74a87929b6b20be4\": container with ID starting with 93c2aa9c5f863802e25403f0ef442b6e74538f673ac9cd8d74a87929b6b20be4 not found: ID does not exist" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.219234 4778 scope.go:117] "RemoveContainer" containerID="71a20655a008ce72b3700a1539ec6f317fd835c637c44a60081e26e6d3164414" Dec 05 16:10:06 crc kubenswrapper[4778]: E1205 16:10:06.219488 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"71a20655a008ce72b3700a1539ec6f317fd835c637c44a60081e26e6d3164414\": container with ID starting with 71a20655a008ce72b3700a1539ec6f317fd835c637c44a60081e26e6d3164414 not found: ID does not exist" containerID="71a20655a008ce72b3700a1539ec6f317fd835c637c44a60081e26e6d3164414" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.219512 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a20655a008ce72b3700a1539ec6f317fd835c637c44a60081e26e6d3164414"} err="failed to get container status \"71a20655a008ce72b3700a1539ec6f317fd835c637c44a60081e26e6d3164414\": rpc error: code = NotFound desc = could not find container \"71a20655a008ce72b3700a1539ec6f317fd835c637c44a60081e26e6d3164414\": container with ID starting with 71a20655a008ce72b3700a1539ec6f317fd835c637c44a60081e26e6d3164414 not found: ID does not exist" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.262460 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4d62fd2-d151-423c-9277-f5e38bc522b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4d62fd2-d151-423c-9277-f5e38bc522b6" (UID: "f4d62fd2-d151-423c-9277-f5e38bc522b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.304519 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snwv7\" (UniqueName: \"kubernetes.io/projected/f4d62fd2-d151-423c-9277-f5e38bc522b6-kube-api-access-snwv7\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.304929 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4d62fd2-d151-423c-9277-f5e38bc522b6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.305016 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4d62fd2-d151-423c-9277-f5e38bc522b6-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.497414 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xlngh"] Dec 05 16:10:06 crc kubenswrapper[4778]: I1205 16:10:06.501992 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xlngh"] Dec 05 16:10:07 crc kubenswrapper[4778]: I1205 16:10:07.256714 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4d62fd2-d151-423c-9277-f5e38bc522b6" path="/var/lib/kubelet/pods/f4d62fd2-d151-423c-9277-f5e38bc522b6/volumes" Dec 05 16:10:08 crc kubenswrapper[4778]: I1205 16:10:08.183685 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zkxj" event={"ID":"63e85179-39a5-418e-abe3-91a9a9a276e3","Type":"ContainerStarted","Data":"39906c1e70c0b08ffa3992f94e6d45c34258fb10bf0b7900132bdbcfeb32fd5f"} Dec 05 16:10:08 crc kubenswrapper[4778]: I1205 16:10:08.186486 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" event={"ID":"585193cc-6d97-46b0-95be-7ae7fffb2d11","Type":"ContainerStarted","Data":"91f8dfbb12f9bcd8f5be4a0d59aa00c5583db7723bd714d7f7f8f4176225da44"} Dec 05 16:10:08 crc kubenswrapper[4778]: I1205 16:10:08.186634 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" Dec 05 16:10:08 crc kubenswrapper[4778]: I1205 16:10:08.189619 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" event={"ID":"4cbfbce7-cb93-4e17-8e3a-688a04322274","Type":"ContainerStarted","Data":"0acefd7b146f8ce79c748bf7df7405a3b00d4aee09aeefd968d5a91c21d6e35c"} Dec 05 16:10:08 crc kubenswrapper[4778]: I1205 16:10:08.191970 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mms85" event={"ID":"63520720-3a3d-44da-b13c-7406c45a6d50","Type":"ContainerStarted","Data":"f3fdf52de01e1231c27c5e5b1a6ee352c2391c33ad223b29772f3eef44115cc3"} Dec 05 16:10:08 crc kubenswrapper[4778]: I1205 16:10:08.192213 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:08 crc kubenswrapper[4778]: I1205 16:10:08.224627 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" podStartSLOduration=1.6698233070000001 podStartE2EDuration="4.224606843s" podCreationTimestamp="2025-12-05 16:10:04 +0000 UTC" firstStartedPulling="2025-12-05 16:10:05.235168531 +0000 UTC m=+892.338964911" lastFinishedPulling="2025-12-05 16:10:07.789952047 +0000 UTC m=+894.893748447" observedRunningTime="2025-12-05 16:10:08.222857307 +0000 UTC m=+895.326653727" watchObservedRunningTime="2025-12-05 16:10:08.224606843 +0000 UTC m=+895.328403233" Dec 05 16:10:08 crc kubenswrapper[4778]: I1205 16:10:08.316258 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mms85" podStartSLOduration=1.613673619 podStartE2EDuration="4.316232166s" podCreationTimestamp="2025-12-05 16:10:04 +0000 UTC" firstStartedPulling="2025-12-05 16:10:05.071711467 +0000 UTC m=+892.175507847" lastFinishedPulling="2025-12-05 16:10:07.774270004 +0000 UTC m=+894.878066394" observedRunningTime="2025-12-05 16:10:08.315774824 +0000 UTC m=+895.419571214" watchObservedRunningTime="2025-12-05 16:10:08.316232166 +0000 UTC m=+895.420028566" Dec 05 16:10:08 crc kubenswrapper[4778]: I1205 16:10:08.329148 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w494m" podStartSLOduration=1.9498601610000001 podStartE2EDuration="4.329127745s" podCreationTimestamp="2025-12-05 16:10:04 +0000 UTC" firstStartedPulling="2025-12-05 16:10:05.361038326 +0000 UTC m=+892.464834706" lastFinishedPulling="2025-12-05 16:10:07.74030591 +0000 UTC m=+894.844102290" observedRunningTime="2025-12-05 16:10:08.288236418 +0000 UTC m=+895.392032798" watchObservedRunningTime="2025-12-05 16:10:08.329127745 +0000 UTC m=+895.432924125" Dec 05 16:10:10 crc kubenswrapper[4778]: I1205 16:10:10.204287 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zkxj" event={"ID":"63e85179-39a5-418e-abe3-91a9a9a276e3","Type":"ContainerStarted","Data":"d6f9b0e63f18b46e79e843f28b4b74b904ba17517db68dc0e58e77b67e8ba383"} Dec 05 16:10:10 crc kubenswrapper[4778]: I1205 16:10:10.222836 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zkxj" podStartSLOduration=1.61108647 podStartE2EDuration="6.222810522s" podCreationTimestamp="2025-12-05 16:10:04 +0000 UTC" firstStartedPulling="2025-12-05 16:10:05.36083001 +0000 UTC m=+892.464626390" lastFinishedPulling="2025-12-05 
16:10:09.972554062 +0000 UTC m=+897.076350442" observedRunningTime="2025-12-05 16:10:10.221417085 +0000 UTC m=+897.325213505" watchObservedRunningTime="2025-12-05 16:10:10.222810522 +0000 UTC m=+897.326606922" Dec 05 16:10:15 crc kubenswrapper[4778]: I1205 16:10:15.086441 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-mms85" Dec 05 16:10:15 crc kubenswrapper[4778]: I1205 16:10:15.370812 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:15 crc kubenswrapper[4778]: I1205 16:10:15.370868 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:15 crc kubenswrapper[4778]: I1205 16:10:15.378782 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:16 crc kubenswrapper[4778]: I1205 16:10:16.268739 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:10:16 crc kubenswrapper[4778]: I1205 16:10:16.340884 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-gnnls"] Dec 05 16:10:25 crc kubenswrapper[4778]: I1205 16:10:25.012906 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-64vbd" Dec 05 16:10:33 crc kubenswrapper[4778]: I1205 16:10:33.414544 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:10:33 crc kubenswrapper[4778]: I1205 16:10:33.415061 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:10:39 crc kubenswrapper[4778]: I1205 16:10:39.851015 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw"] Dec 05 16:10:39 crc kubenswrapper[4778]: E1205 16:10:39.851789 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d62fd2-d151-423c-9277-f5e38bc522b6" containerName="extract-content" Dec 05 16:10:39 crc kubenswrapper[4778]: I1205 16:10:39.851803 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d62fd2-d151-423c-9277-f5e38bc522b6" containerName="extract-content" Dec 05 16:10:39 crc kubenswrapper[4778]: E1205 16:10:39.851816 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d62fd2-d151-423c-9277-f5e38bc522b6" containerName="extract-utilities" Dec 05 16:10:39 crc kubenswrapper[4778]: I1205 16:10:39.851822 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d62fd2-d151-423c-9277-f5e38bc522b6" containerName="extract-utilities" Dec 05 16:10:39 crc kubenswrapper[4778]: E1205 16:10:39.851843 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d62fd2-d151-423c-9277-f5e38bc522b6" containerName="registry-server" Dec 05 16:10:39 crc kubenswrapper[4778]: I1205 16:10:39.851851 4778 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f4d62fd2-d151-423c-9277-f5e38bc522b6" containerName="registry-server" Dec 05 16:10:39 crc kubenswrapper[4778]: I1205 16:10:39.851979 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4d62fd2-d151-423c-9277-f5e38bc522b6" containerName="registry-server" Dec 05 16:10:39 crc kubenswrapper[4778]: I1205 16:10:39.852874 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:39 crc kubenswrapper[4778]: I1205 16:10:39.854626 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 16:10:39 crc kubenswrapper[4778]: I1205 16:10:39.866626 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw"] Dec 05 16:10:39 crc kubenswrapper[4778]: I1205 16:10:39.960480 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b44f9629-0529-4a0b-bb95-5690c110cc51-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw\" (UID: \"b44f9629-0529-4a0b-bb95-5690c110cc51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:39 crc kubenswrapper[4778]: I1205 16:10:39.960567 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg4k7\" (UniqueName: \"kubernetes.io/projected/b44f9629-0529-4a0b-bb95-5690c110cc51-kube-api-access-hg4k7\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw\" (UID: \"b44f9629-0529-4a0b-bb95-5690c110cc51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:39 crc kubenswrapper[4778]: I1205 16:10:39.960595 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b44f9629-0529-4a0b-bb95-5690c110cc51-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw\" (UID: \"b44f9629-0529-4a0b-bb95-5690c110cc51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:40 crc kubenswrapper[4778]: I1205 16:10:40.062565 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg4k7\" (UniqueName: \"kubernetes.io/projected/b44f9629-0529-4a0b-bb95-5690c110cc51-kube-api-access-hg4k7\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw\" (UID: \"b44f9629-0529-4a0b-bb95-5690c110cc51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:40 crc kubenswrapper[4778]: I1205 16:10:40.062644 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b44f9629-0529-4a0b-bb95-5690c110cc51-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw\" (UID: \"b44f9629-0529-4a0b-bb95-5690c110cc51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:40 crc kubenswrapper[4778]: I1205 16:10:40.062734 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b44f9629-0529-4a0b-bb95-5690c110cc51-bundle\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw\" (UID: \"b44f9629-0529-4a0b-bb95-5690c110cc51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:40 crc kubenswrapper[4778]: I1205 16:10:40.063472 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b44f9629-0529-4a0b-bb95-5690c110cc51-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw\" (UID: \"b44f9629-0529-4a0b-bb95-5690c110cc51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:40 crc kubenswrapper[4778]: I1205 16:10:40.063724 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b44f9629-0529-4a0b-bb95-5690c110cc51-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw\" (UID: \"b44f9629-0529-4a0b-bb95-5690c110cc51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:40 crc kubenswrapper[4778]: I1205 16:10:40.088354 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg4k7\" (UniqueName: \"kubernetes.io/projected/b44f9629-0529-4a0b-bb95-5690c110cc51-kube-api-access-hg4k7\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw\" (UID: \"b44f9629-0529-4a0b-bb95-5690c110cc51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:40 crc kubenswrapper[4778]: I1205 16:10:40.173473 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:40 crc kubenswrapper[4778]: I1205 16:10:40.413073 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw"] Dec 05 16:10:41 crc kubenswrapper[4778]: I1205 16:10:41.421819 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" event={"ID":"b44f9629-0529-4a0b-bb95-5690c110cc51","Type":"ContainerStarted","Data":"def278e2fee9e51b3caed48e495b87c60eb8b87b4d010b737dffc580a93eb353"} Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.322696 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-gnnls" podUID="0c134aff-5bc5-4901-8746-5f79fb395b01" containerName="console" containerID="cri-o://d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6" gracePeriod=15 Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.430525 4778 generic.go:334] "Generic (PLEG): container finished" podID="b44f9629-0529-4a0b-bb95-5690c110cc51" containerID="eabf7e3e445221e0665ccce76cba7f634bbed408cdb2a278df7808de1bbee36a" exitCode=0 Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.430582 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" event={"ID":"b44f9629-0529-4a0b-bb95-5690c110cc51","Type":"ContainerDied","Data":"eabf7e3e445221e0665ccce76cba7f634bbed408cdb2a278df7808de1bbee36a"} Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.660480 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-gnnls_0c134aff-5bc5-4901-8746-5f79fb395b01/console/0.log" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.660706 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gnnls" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.799265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-service-ca\") pod \"0c134aff-5bc5-4901-8746-5f79fb395b01\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.799375 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-oauth-serving-cert\") pod \"0c134aff-5bc5-4901-8746-5f79fb395b01\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.799430 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c134aff-5bc5-4901-8746-5f79fb395b01-console-serving-cert\") pod \"0c134aff-5bc5-4901-8746-5f79fb395b01\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.799455 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhh9b\" (UniqueName: \"kubernetes.io/projected/0c134aff-5bc5-4901-8746-5f79fb395b01-kube-api-access-bhh9b\") pod \"0c134aff-5bc5-4901-8746-5f79fb395b01\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.799489 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-console-config\") pod \"0c134aff-5bc5-4901-8746-5f79fb395b01\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.799517 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-trusted-ca-bundle\") pod \"0c134aff-5bc5-4901-8746-5f79fb395b01\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.799589 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c134aff-5bc5-4901-8746-5f79fb395b01-console-oauth-config\") pod \"0c134aff-5bc5-4901-8746-5f79fb395b01\" (UID: \"0c134aff-5bc5-4901-8746-5f79fb395b01\") " Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.800062 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-console-config" (OuterVolumeSpecName: "console-config") pod "0c134aff-5bc5-4901-8746-5f79fb395b01" (UID: "0c134aff-5bc5-4901-8746-5f79fb395b01"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.800085 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0c134aff-5bc5-4901-8746-5f79fb395b01" (UID: "0c134aff-5bc5-4901-8746-5f79fb395b01"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.800055 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-service-ca" (OuterVolumeSpecName: "service-ca") pod "0c134aff-5bc5-4901-8746-5f79fb395b01" (UID: "0c134aff-5bc5-4901-8746-5f79fb395b01"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.800516 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0c134aff-5bc5-4901-8746-5f79fb395b01" (UID: "0c134aff-5bc5-4901-8746-5f79fb395b01"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.805313 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c134aff-5bc5-4901-8746-5f79fb395b01-kube-api-access-bhh9b" (OuterVolumeSpecName: "kube-api-access-bhh9b") pod "0c134aff-5bc5-4901-8746-5f79fb395b01" (UID: "0c134aff-5bc5-4901-8746-5f79fb395b01"). InnerVolumeSpecName "kube-api-access-bhh9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.806597 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c134aff-5bc5-4901-8746-5f79fb395b01-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0c134aff-5bc5-4901-8746-5f79fb395b01" (UID: "0c134aff-5bc5-4901-8746-5f79fb395b01"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.811780 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c134aff-5bc5-4901-8746-5f79fb395b01-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0c134aff-5bc5-4901-8746-5f79fb395b01" (UID: "0c134aff-5bc5-4901-8746-5f79fb395b01"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.900641 4778 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c134aff-5bc5-4901-8746-5f79fb395b01-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.900672 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhh9b\" (UniqueName: \"kubernetes.io/projected/0c134aff-5bc5-4901-8746-5f79fb395b01-kube-api-access-bhh9b\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.900683 4778 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.900691 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.900700 4778 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c134aff-5bc5-4901-8746-5f79fb395b01-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.900709 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:42 crc kubenswrapper[4778]: I1205 16:10:42.900717 4778 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c134aff-5bc5-4901-8746-5f79fb395b01-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:43 crc kubenswrapper[4778]: I1205 16:10:43.438869 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-gnnls_0c134aff-5bc5-4901-8746-5f79fb395b01/console/0.log" Dec 05 16:10:43 crc kubenswrapper[4778]: I1205 16:10:43.439135 4778 generic.go:334] "Generic (PLEG): container finished" podID="0c134aff-5bc5-4901-8746-5f79fb395b01" containerID="d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6" exitCode=2 Dec 05 16:10:43 crc kubenswrapper[4778]: I1205 16:10:43.439167 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gnnls" event={"ID":"0c134aff-5bc5-4901-8746-5f79fb395b01","Type":"ContainerDied","Data":"d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6"} Dec 05 16:10:43 crc kubenswrapper[4778]: I1205 16:10:43.439193 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gnnls" event={"ID":"0c134aff-5bc5-4901-8746-5f79fb395b01","Type":"ContainerDied","Data":"c0e8a92fd0d7c151ef97ecd39512369d823cc097223fcabd5447111ed66e61d2"} Dec 05 16:10:43 crc kubenswrapper[4778]: I1205 16:10:43.439209 4778 scope.go:117] "RemoveContainer" containerID="d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6" Dec 05 16:10:43 crc kubenswrapper[4778]: I1205 16:10:43.439250 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-gnnls" Dec 05 16:10:43 crc kubenswrapper[4778]: I1205 16:10:43.473089 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-gnnls"] Dec 05 16:10:43 crc kubenswrapper[4778]: I1205 16:10:43.476387 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-gnnls"] Dec 05 16:10:43 crc kubenswrapper[4778]: I1205 16:10:43.480423 4778 scope.go:117] "RemoveContainer" containerID="d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6" Dec 05 16:10:43 crc kubenswrapper[4778]: E1205 16:10:43.480865 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6\": container with ID starting with d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6 not found: ID does not exist" containerID="d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6" Dec 05 16:10:43 crc kubenswrapper[4778]: I1205 16:10:43.480897 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6"} err="failed to get container status \"d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6\": rpc error: code = NotFound desc = could not find container \"d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6\": container with ID starting with d5f220b4dc43e9cf69e60c1a0439cd5933b7dbef502ee6e2ffdc53c44af75dd6 not found: ID does not exist" Dec 05 16:10:44 crc kubenswrapper[4778]: I1205 16:10:44.511703 4778 generic.go:334] "Generic (PLEG): container finished" podID="b44f9629-0529-4a0b-bb95-5690c110cc51" containerID="8ade380d28f6a1eeab20aba5e430968a48c7c9907908cf693dc5cb311265f2d3" exitCode=0 Dec 05 16:10:44 crc kubenswrapper[4778]: I1205 16:10:44.512534 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" event={"ID":"b44f9629-0529-4a0b-bb95-5690c110cc51","Type":"ContainerDied","Data":"8ade380d28f6a1eeab20aba5e430968a48c7c9907908cf693dc5cb311265f2d3"} Dec 05 16:10:45 crc kubenswrapper[4778]: I1205 16:10:45.262078 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c134aff-5bc5-4901-8746-5f79fb395b01" path="/var/lib/kubelet/pods/0c134aff-5bc5-4901-8746-5f79fb395b01/volumes" Dec 05 16:10:45 crc kubenswrapper[4778]: I1205 16:10:45.524230 4778 generic.go:334] "Generic (PLEG): container finished" podID="b44f9629-0529-4a0b-bb95-5690c110cc51" containerID="f7dfcdd086505f0a2ce901c42572a02118188eaecf5ae79e389050618465d4e6" exitCode=0 Dec 05 16:10:45 crc kubenswrapper[4778]: I1205 16:10:45.524301 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" event={"ID":"b44f9629-0529-4a0b-bb95-5690c110cc51","Type":"ContainerDied","Data":"f7dfcdd086505f0a2ce901c42572a02118188eaecf5ae79e389050618465d4e6"} Dec 05 16:10:46 crc kubenswrapper[4778]: I1205 16:10:46.855115 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:47 crc kubenswrapper[4778]: I1205 16:10:47.035221 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg4k7\" (UniqueName: \"kubernetes.io/projected/b44f9629-0529-4a0b-bb95-5690c110cc51-kube-api-access-hg4k7\") pod \"b44f9629-0529-4a0b-bb95-5690c110cc51\" (UID: \"b44f9629-0529-4a0b-bb95-5690c110cc51\") " Dec 05 16:10:47 crc kubenswrapper[4778]: I1205 16:10:47.035415 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b44f9629-0529-4a0b-bb95-5690c110cc51-util\") pod \"b44f9629-0529-4a0b-bb95-5690c110cc51\" (UID: \"b44f9629-0529-4a0b-bb95-5690c110cc51\") " Dec 05 16:10:47 crc kubenswrapper[4778]: I1205 16:10:47.035451 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b44f9629-0529-4a0b-bb95-5690c110cc51-bundle\") pod \"b44f9629-0529-4a0b-bb95-5690c110cc51\" (UID: \"b44f9629-0529-4a0b-bb95-5690c110cc51\") " Dec 05 16:10:47 crc kubenswrapper[4778]: I1205 16:10:47.036751 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b44f9629-0529-4a0b-bb95-5690c110cc51-bundle" (OuterVolumeSpecName: "bundle") pod "b44f9629-0529-4a0b-bb95-5690c110cc51" (UID: "b44f9629-0529-4a0b-bb95-5690c110cc51"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:10:47 crc kubenswrapper[4778]: I1205 16:10:47.040820 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b44f9629-0529-4a0b-bb95-5690c110cc51-kube-api-access-hg4k7" (OuterVolumeSpecName: "kube-api-access-hg4k7") pod "b44f9629-0529-4a0b-bb95-5690c110cc51" (UID: "b44f9629-0529-4a0b-bb95-5690c110cc51"). InnerVolumeSpecName "kube-api-access-hg4k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:10:47 crc kubenswrapper[4778]: I1205 16:10:47.055495 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b44f9629-0529-4a0b-bb95-5690c110cc51-util" (OuterVolumeSpecName: "util") pod "b44f9629-0529-4a0b-bb95-5690c110cc51" (UID: "b44f9629-0529-4a0b-bb95-5690c110cc51"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:10:47 crc kubenswrapper[4778]: I1205 16:10:47.137342 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg4k7\" (UniqueName: \"kubernetes.io/projected/b44f9629-0529-4a0b-bb95-5690c110cc51-kube-api-access-hg4k7\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:47 crc kubenswrapper[4778]: I1205 16:10:47.137404 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b44f9629-0529-4a0b-bb95-5690c110cc51-util\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:47 crc kubenswrapper[4778]: I1205 16:10:47.137421 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b44f9629-0529-4a0b-bb95-5690c110cc51-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:10:47 crc kubenswrapper[4778]: I1205 16:10:47.542479 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" event={"ID":"b44f9629-0529-4a0b-bb95-5690c110cc51","Type":"ContainerDied","Data":"def278e2fee9e51b3caed48e495b87c60eb8b87b4d010b737dffc580a93eb353"} Dec 05 16:10:47 crc kubenswrapper[4778]: I1205 16:10:47.542541 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="def278e2fee9e51b3caed48e495b87c60eb8b87b4d010b737dffc580a93eb353" Dec 05 16:10:47 crc kubenswrapper[4778]: I1205 16:10:47.542623 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.096236 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr"] Dec 05 16:10:56 crc kubenswrapper[4778]: E1205 16:10:56.097034 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b44f9629-0529-4a0b-bb95-5690c110cc51" containerName="util" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.097048 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b44f9629-0529-4a0b-bb95-5690c110cc51" containerName="util" Dec 05 16:10:56 crc kubenswrapper[4778]: E1205 16:10:56.097069 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c134aff-5bc5-4901-8746-5f79fb395b01" containerName="console" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.097076 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c134aff-5bc5-4901-8746-5f79fb395b01" containerName="console" Dec 05 16:10:56 crc kubenswrapper[4778]: E1205 16:10:56.097088 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b44f9629-0529-4a0b-bb95-5690c110cc51" containerName="pull" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.097097 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b44f9629-0529-4a0b-bb95-5690c110cc51" containerName="pull" Dec 05 16:10:56 crc kubenswrapper[4778]: E1205 16:10:56.097111 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b44f9629-0529-4a0b-bb95-5690c110cc51" containerName="extract" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.097117 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b44f9629-0529-4a0b-bb95-5690c110cc51" containerName="extract" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.097224 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c134aff-5bc5-4901-8746-5f79fb395b01" containerName="console" Dec 
05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.097236 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b44f9629-0529-4a0b-bb95-5690c110cc51" containerName="extract" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.097732 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.101485 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.101591 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.101767 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.101885 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.101980 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-sht78" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.108478 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr"] Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.251975 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5f9a9c1-e0f5-4b56-b66e-c70686a83d57-webhook-cert\") pod \"metallb-operator-controller-manager-78b6b5f7d4-wmnmr\" (UID: \"e5f9a9c1-e0f5-4b56-b66e-c70686a83d57\") " pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.252067 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5f9a9c1-e0f5-4b56-b66e-c70686a83d57-apiservice-cert\") pod \"metallb-operator-controller-manager-78b6b5f7d4-wmnmr\" (UID: \"e5f9a9c1-e0f5-4b56-b66e-c70686a83d57\") " pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.252125 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm5t9\" (UniqueName: \"kubernetes.io/projected/e5f9a9c1-e0f5-4b56-b66e-c70686a83d57-kube-api-access-hm5t9\") pod \"metallb-operator-controller-manager-78b6b5f7d4-wmnmr\" (UID: \"e5f9a9c1-e0f5-4b56-b66e-c70686a83d57\") " pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.353033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm5t9\" (UniqueName: \"kubernetes.io/projected/e5f9a9c1-e0f5-4b56-b66e-c70686a83d57-kube-api-access-hm5t9\") pod \"metallb-operator-controller-manager-78b6b5f7d4-wmnmr\" (UID: \"e5f9a9c1-e0f5-4b56-b66e-c70686a83d57\") " pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.353148 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/e5f9a9c1-e0f5-4b56-b66e-c70686a83d57-webhook-cert\") pod \"metallb-operator-controller-manager-78b6b5f7d4-wmnmr\" (UID: \"e5f9a9c1-e0f5-4b56-b66e-c70686a83d57\") " pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.353205 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5f9a9c1-e0f5-4b56-b66e-c70686a83d57-apiservice-cert\") pod \"metallb-operator-controller-manager-78b6b5f7d4-wmnmr\" (UID: \"e5f9a9c1-e0f5-4b56-b66e-c70686a83d57\") " pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.360266 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5f9a9c1-e0f5-4b56-b66e-c70686a83d57-webhook-cert\") pod \"metallb-operator-controller-manager-78b6b5f7d4-wmnmr\" (UID: \"e5f9a9c1-e0f5-4b56-b66e-c70686a83d57\") " pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.374331 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5f9a9c1-e0f5-4b56-b66e-c70686a83d57-apiservice-cert\") pod \"metallb-operator-controller-manager-78b6b5f7d4-wmnmr\" (UID: \"e5f9a9c1-e0f5-4b56-b66e-c70686a83d57\") " pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.375815 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm5t9\" (UniqueName: \"kubernetes.io/projected/e5f9a9c1-e0f5-4b56-b66e-c70686a83d57-kube-api-access-hm5t9\") pod \"metallb-operator-controller-manager-78b6b5f7d4-wmnmr\" (UID: \"e5f9a9c1-e0f5-4b56-b66e-c70686a83d57\") " pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.415418 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.539929 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk"] Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.540880 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:10:56 crc kubenswrapper[4778]: W1205 16:10:56.543471 4778 reflector.go:561] object-"metallb-system"/"metallb-operator-webhook-server-service-cert": failed to list *v1.Secret: secrets "metallb-operator-webhook-server-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Dec 05 16:10:56 crc kubenswrapper[4778]: E1205 16:10:56.543540 4778 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-webhook-server-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-webhook-server-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 16:10:56 crc kubenswrapper[4778]: W1205 16:10:56.543487 4778 reflector.go:561] object-"metallb-system"/"metallb-webhook-cert": failed to list *v1.Secret: secrets "metallb-webhook-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Dec 05 16:10:56 crc kubenswrapper[4778]: E1205 16:10:56.543591 4778 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-webhook-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-webhook-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 16:10:56 crc kubenswrapper[4778]: W1205 16:10:56.548881 4778 reflector.go:561] object-"metallb-system"/"controller-dockercfg-gwp45": failed to list *v1.Secret: secrets "controller-dockercfg-gwp45" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Dec 05 16:10:56 crc kubenswrapper[4778]: E1205 16:10:56.548924 4778 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"controller-dockercfg-gwp45\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"controller-dockercfg-gwp45\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.569821 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk"] Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.660055 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/54d90db9-7dd5-4012-9dd8-1fd612a4db38-webhook-cert\") pod \"metallb-operator-webhook-server-695c4fbf6f-4nkqk\" (UID: \"54d90db9-7dd5-4012-9dd8-1fd612a4db38\") " pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.660137 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckfll\" (UniqueName: 
\"kubernetes.io/projected/54d90db9-7dd5-4012-9dd8-1fd612a4db38-kube-api-access-ckfll\") pod \"metallb-operator-webhook-server-695c4fbf6f-4nkqk\" (UID: \"54d90db9-7dd5-4012-9dd8-1fd612a4db38\") " pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.660171 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/54d90db9-7dd5-4012-9dd8-1fd612a4db38-apiservice-cert\") pod \"metallb-operator-webhook-server-695c4fbf6f-4nkqk\" (UID: \"54d90db9-7dd5-4012-9dd8-1fd612a4db38\") " pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.759330 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr"] Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.760750 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/54d90db9-7dd5-4012-9dd8-1fd612a4db38-apiservice-cert\") pod \"metallb-operator-webhook-server-695c4fbf6f-4nkqk\" (UID: \"54d90db9-7dd5-4012-9dd8-1fd612a4db38\") " pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.760809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/54d90db9-7dd5-4012-9dd8-1fd612a4db38-webhook-cert\") pod \"metallb-operator-webhook-server-695c4fbf6f-4nkqk\" (UID: \"54d90db9-7dd5-4012-9dd8-1fd612a4db38\") " pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.760857 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckfll\" (UniqueName: \"kubernetes.io/projected/54d90db9-7dd5-4012-9dd8-1fd612a4db38-kube-api-access-ckfll\") pod \"metallb-operator-webhook-server-695c4fbf6f-4nkqk\" (UID: \"54d90db9-7dd5-4012-9dd8-1fd612a4db38\") " pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:10:56 crc kubenswrapper[4778]: W1205 16:10:56.764420 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5f9a9c1_e0f5_4b56_b66e_c70686a83d57.slice/crio-5fb7ec556e5ec755bf814352afd7fa16e19e87d3977b18c875cb9ec157797fa7 WatchSource:0}: Error finding container 5fb7ec556e5ec755bf814352afd7fa16e19e87d3977b18c875cb9ec157797fa7: Status 404 returned error can't find the container with id 5fb7ec556e5ec755bf814352afd7fa16e19e87d3977b18c875cb9ec157797fa7 Dec 05 16:10:56 crc kubenswrapper[4778]: I1205 16:10:56.777626 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckfll\" (UniqueName: \"kubernetes.io/projected/54d90db9-7dd5-4012-9dd8-1fd612a4db38-kube-api-access-ckfll\") pod \"metallb-operator-webhook-server-695c4fbf6f-4nkqk\" (UID: \"54d90db9-7dd5-4012-9dd8-1fd612a4db38\") " pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:10:57 crc kubenswrapper[4778]: I1205 16:10:57.421255 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 05 16:10:57 crc kubenswrapper[4778]: I1205 16:10:57.426933 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/54d90db9-7dd5-4012-9dd8-1fd612a4db38-webhook-cert\") pod \"metallb-operator-webhook-server-695c4fbf6f-4nkqk\" (UID: \"54d90db9-7dd5-4012-9dd8-1fd612a4db38\") " pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:10:57 crc kubenswrapper[4778]: I1205 16:10:57.427227 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/54d90db9-7dd5-4012-9dd8-1fd612a4db38-apiservice-cert\") pod \"metallb-operator-webhook-server-695c4fbf6f-4nkqk\" (UID: \"54d90db9-7dd5-4012-9dd8-1fd612a4db38\") " pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:10:57 crc kubenswrapper[4778]: I1205 16:10:57.614869 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" event={"ID":"e5f9a9c1-e0f5-4b56-b66e-c70686a83d57","Type":"ContainerStarted","Data":"5fb7ec556e5ec755bf814352afd7fa16e19e87d3977b18c875cb9ec157797fa7"} Dec 05 16:10:57 crc kubenswrapper[4778]: I1205 16:10:57.642531 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 16:10:57 crc kubenswrapper[4778]: I1205 16:10:57.653695 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-gwp45" Dec 05 16:10:57 crc kubenswrapper[4778]: I1205 16:10:57.657138 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:10:58 crc kubenswrapper[4778]: I1205 16:10:58.149904 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk"] Dec 05 16:10:58 crc kubenswrapper[4778]: W1205 16:10:58.168292 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54d90db9_7dd5_4012_9dd8_1fd612a4db38.slice/crio-470c9f2e7eb78b4bbdc3bac94dbc852ec31ae23581090e499d13cbfaf63f56e4 WatchSource:0}: Error finding container 470c9f2e7eb78b4bbdc3bac94dbc852ec31ae23581090e499d13cbfaf63f56e4: Status 404 returned error can't find the container with id 470c9f2e7eb78b4bbdc3bac94dbc852ec31ae23581090e499d13cbfaf63f56e4 Dec 05 16:10:58 crc kubenswrapper[4778]: I1205 16:10:58.623862 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" event={"ID":"54d90db9-7dd5-4012-9dd8-1fd612a4db38","Type":"ContainerStarted","Data":"470c9f2e7eb78b4bbdc3bac94dbc852ec31ae23581090e499d13cbfaf63f56e4"} Dec 05 16:11:00 crc kubenswrapper[4778]: I1205 16:11:00.637239 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" event={"ID":"e5f9a9c1-e0f5-4b56-b66e-c70686a83d57","Type":"ContainerStarted","Data":"1f430df1cdda77a3cd27e67a0233a8a7b11c5b0ceab352f2c62668680c26292a"} Dec 05 16:11:00 crc kubenswrapper[4778]: I1205 16:11:00.637617 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:11:00 crc kubenswrapper[4778]: I1205 16:11:00.660482 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" podStartSLOduration=1.1855050870000001 podStartE2EDuration="4.660458733s" podCreationTimestamp="2025-12-05 16:10:56 +0000 UTC" 
firstStartedPulling="2025-12-05 16:10:56.768120476 +0000 UTC m=+943.871916856" lastFinishedPulling="2025-12-05 16:11:00.243074112 +0000 UTC m=+947.346870502" observedRunningTime="2025-12-05 16:11:00.655237046 +0000 UTC m=+947.759033426" watchObservedRunningTime="2025-12-05 16:11:00.660458733 +0000 UTC m=+947.764255133" Dec 05 16:11:03 crc kubenswrapper[4778]: I1205 16:11:03.414911 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:11:03 crc kubenswrapper[4778]: I1205 16:11:03.415318 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:11:03 crc kubenswrapper[4778]: I1205 16:11:03.415405 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 16:11:03 crc kubenswrapper[4778]: I1205 16:11:03.416163 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d05f43ec797c17341e7f030a46399a4fd0a9ce3922c28bd4bd201675fc830e2a"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:11:03 crc kubenswrapper[4778]: I1205 16:11:03.416281 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://d05f43ec797c17341e7f030a46399a4fd0a9ce3922c28bd4bd201675fc830e2a" gracePeriod=600 Dec 05 16:11:03 crc kubenswrapper[4778]: I1205 16:11:03.666975 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" event={"ID":"54d90db9-7dd5-4012-9dd8-1fd612a4db38","Type":"ContainerStarted","Data":"c737069b1917e2f7dc6b9fd882778ab6d87e5ce41cbac5dd4ceb64fdf6a9ae63"} Dec 05 16:11:03 crc kubenswrapper[4778]: I1205 16:11:03.667376 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:11:03 crc kubenswrapper[4778]: I1205 16:11:03.676129 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="d05f43ec797c17341e7f030a46399a4fd0a9ce3922c28bd4bd201675fc830e2a" exitCode=0 Dec 05 16:11:03 crc kubenswrapper[4778]: I1205 16:11:03.676189 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"d05f43ec797c17341e7f030a46399a4fd0a9ce3922c28bd4bd201675fc830e2a"} Dec 05 16:11:03 crc kubenswrapper[4778]: I1205 16:11:03.676219 4778 scope.go:117] "RemoveContainer" containerID="2f0eb734c5f784238f5df16ab2b9c81ee74d1805c2a6835ca18eb607dfb3dd7b" Dec 05 16:11:03 crc kubenswrapper[4778]: I1205 16:11:03.694377 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" podStartSLOduration=3.321028872 podStartE2EDuration="7.694340655s" podCreationTimestamp="2025-12-05 16:10:56 +0000 UTC" firstStartedPulling="2025-12-05 16:10:58.17222484 +0000 UTC m=+945.276021220" lastFinishedPulling="2025-12-05 16:11:02.545536623 +0000 UTC m=+949.649333003" observedRunningTime="2025-12-05 16:11:03.691834859 +0000 UTC m=+950.795631239" watchObservedRunningTime="2025-12-05 16:11:03.694340655 +0000 UTC m=+950.798137035" Dec 05 16:11:04 crc kubenswrapper[4778]: I1205 16:11:04.685042 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"aea0312e36d87c23ce634b679d6ae2137df783585ff65eb7e4e65c9564abd0b6"} Dec 05 16:11:17 crc kubenswrapper[4778]: I1205 16:11:17.662299 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-695c4fbf6f-4nkqk" Dec 05 16:11:36 crc kubenswrapper[4778]: I1205 16:11:36.418565 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-78b6b5f7d4-wmnmr" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.166301 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5lwxz"] Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.169075 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.171039 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.171215 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg"] Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.171746 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.171752 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.171792 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-zwgd4" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.175392 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.179610 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg"] Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.265484 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-qnv6q"] Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.266631 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-qnv6q" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.268813 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.268899 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.268913 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-4jl76" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.271200 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.296661 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-ld99k"] Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.297745 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.299799 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.301434 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518d4b75-d756-48af-80a4-26e23ff4507b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-nz7bg\" (UID: \"518d4b75-d756-48af-80a4-26e23ff4507b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.301473 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-reloader\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.301496 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rnzf\" (UniqueName: \"kubernetes.io/projected/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-kube-api-access-4rnzf\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.301518 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-metrics\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.301549 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-frr-sockets\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.301587 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-frr-startup\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " 
pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.301626 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b8bb\" (UniqueName: \"kubernetes.io/projected/518d4b75-d756-48af-80a4-26e23ff4507b-kube-api-access-9b8bb\") pod \"frr-k8s-webhook-server-7fcb986d4-nz7bg\" (UID: \"518d4b75-d756-48af-80a4-26e23ff4507b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.301709 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-frr-conf\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.301791 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-metrics-certs\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.329573 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-ld99k"] Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.402984 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/687c889e-852e-49f7-a18f-4992370c1829-cert\") pod \"controller-f8648f98b-ld99k\" (UID: \"687c889e-852e-49f7-a18f-4992370c1829\") " pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.403080 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518d4b75-d756-48af-80a4-26e23ff4507b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-nz7bg\" (UID: \"518d4b75-d756-48af-80a4-26e23ff4507b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.403103 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-reloader\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.403621 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-reloader\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.403901 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rnzf\" (UniqueName: \"kubernetes.io/projected/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-kube-api-access-4rnzf\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.403936 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-metrics\") pod \"frr-k8s-5lwxz\" (UID: 
\"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.403955 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5spx6\" (UniqueName: \"kubernetes.io/projected/82af16e8-7640-4de9-bd7c-baaf968f7a98-kube-api-access-5spx6\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.403978 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/82af16e8-7640-4de9-bd7c-baaf968f7a98-metallb-excludel2\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.403997 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-frr-sockets\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.404011 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82af16e8-7640-4de9-bd7c-baaf968f7a98-memberlist\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.404042 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-frr-startup\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.404099 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b8bb\" (UniqueName: \"kubernetes.io/projected/518d4b75-d756-48af-80a4-26e23ff4507b-kube-api-access-9b8bb\") pod \"frr-k8s-webhook-server-7fcb986d4-nz7bg\" (UID: \"518d4b75-d756-48af-80a4-26e23ff4507b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.404114 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82af16e8-7640-4de9-bd7c-baaf968f7a98-metrics-certs\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.404130 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-frr-conf\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.404148 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh729\" (UniqueName: \"kubernetes.io/projected/687c889e-852e-49f7-a18f-4992370c1829-kube-api-access-dh729\") pod \"controller-f8648f98b-ld99k\" (UID: \"687c889e-852e-49f7-a18f-4992370c1829\") " pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:37 crc 
kubenswrapper[4778]: I1205 16:11:37.404171 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-metrics-certs\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.404196 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/687c889e-852e-49f7-a18f-4992370c1829-metrics-certs\") pod \"controller-f8648f98b-ld99k\" (UID: \"687c889e-852e-49f7-a18f-4992370c1829\") " pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.404642 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-metrics\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.404872 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-frr-sockets\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.404917 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-frr-startup\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.405094 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-frr-conf\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.410994 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-metrics-certs\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.423942 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rnzf\" (UniqueName: \"kubernetes.io/projected/c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6-kube-api-access-4rnzf\") pod \"frr-k8s-5lwxz\" (UID: \"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6\") " pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.425841 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518d4b75-d756-48af-80a4-26e23ff4507b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-nz7bg\" (UID: \"518d4b75-d756-48af-80a4-26e23ff4507b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.431096 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b8bb\" (UniqueName: \"kubernetes.io/projected/518d4b75-d756-48af-80a4-26e23ff4507b-kube-api-access-9b8bb\") pod \"frr-k8s-webhook-server-7fcb986d4-nz7bg\" (UID: 
\"518d4b75-d756-48af-80a4-26e23ff4507b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.491518 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.504387 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.505321 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82af16e8-7640-4de9-bd7c-baaf968f7a98-metrics-certs\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.505393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh729\" (UniqueName: \"kubernetes.io/projected/687c889e-852e-49f7-a18f-4992370c1829-kube-api-access-dh729\") pod \"controller-f8648f98b-ld99k\" (UID: \"687c889e-852e-49f7-a18f-4992370c1829\") " pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.505430 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/687c889e-852e-49f7-a18f-4992370c1829-metrics-certs\") pod \"controller-f8648f98b-ld99k\" (UID: \"687c889e-852e-49f7-a18f-4992370c1829\") " pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.505452 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/687c889e-852e-49f7-a18f-4992370c1829-cert\") pod \"controller-f8648f98b-ld99k\" (UID: \"687c889e-852e-49f7-a18f-4992370c1829\") " pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.505507 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5spx6\" (UniqueName: \"kubernetes.io/projected/82af16e8-7640-4de9-bd7c-baaf968f7a98-kube-api-access-5spx6\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.505531 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/82af16e8-7640-4de9-bd7c-baaf968f7a98-metallb-excludel2\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.505548 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82af16e8-7640-4de9-bd7c-baaf968f7a98-memberlist\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:37 crc kubenswrapper[4778]: E1205 16:11:37.505694 4778 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 16:11:37 crc kubenswrapper[4778]: E1205 16:11:37.505758 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82af16e8-7640-4de9-bd7c-baaf968f7a98-memberlist podName:82af16e8-7640-4de9-bd7c-baaf968f7a98 nodeName:}" failed. 
No retries permitted until 2025-12-05 16:11:38.00572555 +0000 UTC m=+985.109521930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/82af16e8-7640-4de9-bd7c-baaf968f7a98-memberlist") pod "speaker-qnv6q" (UID: "82af16e8-7640-4de9-bd7c-baaf968f7a98") : secret "metallb-memberlist" not found Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.507113 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/82af16e8-7640-4de9-bd7c-baaf968f7a98-metallb-excludel2\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.509857 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82af16e8-7640-4de9-bd7c-baaf968f7a98-metrics-certs\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.511828 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/687c889e-852e-49f7-a18f-4992370c1829-metrics-certs\") pod \"controller-f8648f98b-ld99k\" (UID: \"687c889e-852e-49f7-a18f-4992370c1829\") " pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.513584 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.524299 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh729\" (UniqueName: \"kubernetes.io/projected/687c889e-852e-49f7-a18f-4992370c1829-kube-api-access-dh729\") pod \"controller-f8648f98b-ld99k\" (UID: \"687c889e-852e-49f7-a18f-4992370c1829\") " pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.525891 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/687c889e-852e-49f7-a18f-4992370c1829-cert\") pod \"controller-f8648f98b-ld99k\" (UID: \"687c889e-852e-49f7-a18f-4992370c1829\") " pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.531961 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5spx6\" (UniqueName: \"kubernetes.io/projected/82af16e8-7640-4de9-bd7c-baaf968f7a98-kube-api-access-5spx6\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.639343 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.772057 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg"] Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.889406 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" event={"ID":"518d4b75-d756-48af-80a4-26e23ff4507b","Type":"ContainerStarted","Data":"98e9d199cd4f667bdbcdded18ccd8c2c37865cf5709a9e32810516863f2da310"} Dec 05 16:11:37 crc kubenswrapper[4778]: I1205 16:11:37.890636 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5lwxz" event={"ID":"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6","Type":"ContainerStarted","Data":"752b49a4f3478ef16efd34e0263d0497c99faae153e1a8d9921684b5a032a31a"} Dec 05 16:11:38 crc kubenswrapper[4778]: I1205 16:11:38.045241 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82af16e8-7640-4de9-bd7c-baaf968f7a98-memberlist\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:38 crc kubenswrapper[4778]: E1205 16:11:38.045510 4778 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 16:11:38 crc kubenswrapper[4778]: E1205 16:11:38.045628 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82af16e8-7640-4de9-bd7c-baaf968f7a98-memberlist podName:82af16e8-7640-4de9-bd7c-baaf968f7a98 nodeName:}" failed. No retries permitted until 2025-12-05 16:11:39.045601734 +0000 UTC m=+986.149398144 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/82af16e8-7640-4de9-bd7c-baaf968f7a98-memberlist") pod "speaker-qnv6q" (UID: "82af16e8-7640-4de9-bd7c-baaf968f7a98") : secret "metallb-memberlist" not found Dec 05 16:11:38 crc kubenswrapper[4778]: I1205 16:11:38.080542 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-ld99k"] Dec 05 16:11:38 crc kubenswrapper[4778]: W1205 16:11:38.084872 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687c889e_852e_49f7_a18f_4992370c1829.slice/crio-24a127e7615bbab307aa4376d4dd91e4a6317bc1eb726fe1b8647bbb5833ad8f WatchSource:0}: Error finding container 24a127e7615bbab307aa4376d4dd91e4a6317bc1eb726fe1b8647bbb5833ad8f: Status 404 returned error can't find the container with id 24a127e7615bbab307aa4376d4dd91e4a6317bc1eb726fe1b8647bbb5833ad8f Dec 05 16:11:38 crc kubenswrapper[4778]: I1205 16:11:38.901944 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-ld99k" event={"ID":"687c889e-852e-49f7-a18f-4992370c1829","Type":"ContainerStarted","Data":"efe7cf0d94164de947eaef52707ef1e0db7e2553a26b783ade66e86808963795"} Dec 05 16:11:38 crc kubenswrapper[4778]: I1205 16:11:38.902273 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:38 crc kubenswrapper[4778]: I1205 16:11:38.902283 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-ld99k" event={"ID":"687c889e-852e-49f7-a18f-4992370c1829","Type":"ContainerStarted","Data":"4926fe2a51b0ae7b3a92b8fcdeb9b58e1413aed9671ae13d8f3c90aaec39cb78"} Dec 05 16:11:38 crc kubenswrapper[4778]: I1205 16:11:38.902292 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-ld99k" event={"ID":"687c889e-852e-49f7-a18f-4992370c1829","Type":"ContainerStarted","Data":"24a127e7615bbab307aa4376d4dd91e4a6317bc1eb726fe1b8647bbb5833ad8f"} Dec 05 16:11:38 crc kubenswrapper[4778]: I1205 16:11:38.919564 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-ld99k" podStartSLOduration=1.919543482 podStartE2EDuration="1.919543482s" podCreationTimestamp="2025-12-05 16:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:11:38.919050899 +0000 UTC m=+986.022847289" watchObservedRunningTime="2025-12-05 16:11:38.919543482 +0000 UTC m=+986.023339872" Dec 05 16:11:39 crc kubenswrapper[4778]: I1205 16:11:39.059593 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82af16e8-7640-4de9-bd7c-baaf968f7a98-memberlist\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:39 crc kubenswrapper[4778]: I1205 16:11:39.077303 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82af16e8-7640-4de9-bd7c-baaf968f7a98-memberlist\") pod \"speaker-qnv6q\" (UID: \"82af16e8-7640-4de9-bd7c-baaf968f7a98\") " pod="metallb-system/speaker-qnv6q" Dec 05 16:11:39 crc kubenswrapper[4778]: I1205 16:11:39.079961 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-qnv6q" Dec 05 16:11:39 crc kubenswrapper[4778]: I1205 16:11:39.910595 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qnv6q" event={"ID":"82af16e8-7640-4de9-bd7c-baaf968f7a98","Type":"ContainerStarted","Data":"8a4a3f0d8f216d72302db8e6afb608b80130e1592f59c9a5250769608b96c45a"} Dec 05 16:11:39 crc kubenswrapper[4778]: I1205 16:11:39.910643 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qnv6q" event={"ID":"82af16e8-7640-4de9-bd7c-baaf968f7a98","Type":"ContainerStarted","Data":"501f33c4d24b2f7ef5b72e42c875fd02093f8925cd01981223633f433105144f"} Dec 05 16:11:39 crc kubenswrapper[4778]: I1205 16:11:39.910655 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qnv6q" event={"ID":"82af16e8-7640-4de9-bd7c-baaf968f7a98","Type":"ContainerStarted","Data":"d0f7002e0480a396152c70aab299fbaa017a8b4b664ff6b0c411dfdf80514ac9"} Dec 05 16:11:39 crc kubenswrapper[4778]: I1205 16:11:39.910939 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-qnv6q" Dec 05 16:11:39 crc kubenswrapper[4778]: I1205 16:11:39.934004 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-qnv6q" podStartSLOduration=2.933985341 podStartE2EDuration="2.933985341s" podCreationTimestamp="2025-12-05 16:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:11:39.92950277 +0000 UTC m=+987.033299150" watchObservedRunningTime="2025-12-05 16:11:39.933985341 +0000 UTC m=+987.037781721" Dec 05 16:11:47 crc kubenswrapper[4778]: I1205 16:11:47.967568 4778 generic.go:334] "Generic (PLEG): container finished" podID="c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6" containerID="d8a9a6d3241e51ab18606d6b88bc372e4d2245840f1628237ea8cdc96b3b043d" exitCode=0 Dec 05 16:11:47 crc kubenswrapper[4778]: I1205 16:11:47.967631 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5lwxz" event={"ID":"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6","Type":"ContainerDied","Data":"d8a9a6d3241e51ab18606d6b88bc372e4d2245840f1628237ea8cdc96b3b043d"} Dec 05 16:11:47 crc kubenswrapper[4778]: I1205 16:11:47.969894 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" event={"ID":"518d4b75-d756-48af-80a4-26e23ff4507b","Type":"ContainerStarted","Data":"ec1fe2a96e4aa3811be176f4581af3f2973c64a5cc5e3f4e0feb2981ecbf41fa"} Dec 05 16:11:47 crc kubenswrapper[4778]: I1205 16:11:47.970100 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" Dec 05 16:11:48 crc kubenswrapper[4778]: I1205 16:11:48.016336 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" podStartSLOduration=1.6840741320000001 podStartE2EDuration="11.016319035s" podCreationTimestamp="2025-12-05 16:11:37 +0000 UTC" firstStartedPulling="2025-12-05 16:11:37.7865612 +0000 UTC m=+984.890357580" lastFinishedPulling="2025-12-05 16:11:47.118806103 +0000 UTC m=+994.222602483" observedRunningTime="2025-12-05 16:11:48.01430606 +0000 UTC m=+995.118102440" watchObservedRunningTime="2025-12-05 16:11:48.016319035 +0000 UTC m=+995.120115425" Dec 05 16:11:48 crc kubenswrapper[4778]: I1205 16:11:48.979774 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6" containerID="2fb9cdc54718940725e99e576f6221a3331685b831f87bfeb6e332bc043332ff" exitCode=0 Dec 05 16:11:48 crc kubenswrapper[4778]: I1205 16:11:48.979915 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5lwxz" event={"ID":"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6","Type":"ContainerDied","Data":"2fb9cdc54718940725e99e576f6221a3331685b831f87bfeb6e332bc043332ff"} Dec 05 16:11:49 crc kubenswrapper[4778]: I1205 16:11:49.088341 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-qnv6q" Dec 05 16:11:49 crc kubenswrapper[4778]: I1205 16:11:49.987669 4778 generic.go:334] "Generic (PLEG): container finished" podID="c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6" containerID="44a723ad4301fa5d7f1452c7bc4c04c7eb0ff666b2d96d7fb4fcc8a5f8e3f0dc" exitCode=0 Dec 05 16:11:49 crc kubenswrapper[4778]: I1205 16:11:49.987778 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5lwxz" event={"ID":"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6","Type":"ContainerDied","Data":"44a723ad4301fa5d7f1452c7bc4c04c7eb0ff666b2d96d7fb4fcc8a5f8e3f0dc"} Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.600113 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt"] Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.602418 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.605060 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.620941 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2210330e-e3d5-4777-ad4f-61aa0a94f73a-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt\" (UID: \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.621020 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2210330e-e3d5-4777-ad4f-61aa0a94f73a-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt\" (UID: \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.621057 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k4jr\" (UniqueName: \"kubernetes.io/projected/2210330e-e3d5-4777-ad4f-61aa0a94f73a-kube-api-access-6k4jr\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt\" (UID: \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.624902 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt"] Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.722044 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2210330e-e3d5-4777-ad4f-61aa0a94f73a-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt\" (UID: \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.722142 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2210330e-e3d5-4777-ad4f-61aa0a94f73a-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt\" (UID: \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.722180 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k4jr\" (UniqueName: \"kubernetes.io/projected/2210330e-e3d5-4777-ad4f-61aa0a94f73a-kube-api-access-6k4jr\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt\" (UID: \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.722675 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2210330e-e3d5-4777-ad4f-61aa0a94f73a-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt\" (UID: \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.722905 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2210330e-e3d5-4777-ad4f-61aa0a94f73a-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt\" (UID: \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.744041 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k4jr\" (UniqueName: \"kubernetes.io/projected/2210330e-e3d5-4777-ad4f-61aa0a94f73a-kube-api-access-6k4jr\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt\" (UID: \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.926812 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:11:50 crc kubenswrapper[4778]: I1205 16:11:50.997788 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5lwxz" event={"ID":"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6","Type":"ContainerStarted","Data":"13384a0ee758ce8043ae8b9411d34aa449befd3c724971fd3787a40ae43d6602"} Dec 05 16:11:51 crc kubenswrapper[4778]: I1205 16:11:51.166350 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt"] Dec 05 16:11:51 crc kubenswrapper[4778]: W1205 16:11:51.170838 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2210330e_e3d5_4777_ad4f_61aa0a94f73a.slice/crio-35e8a039ec77da9e5fe1f2e7bc7db65e2359a3dbc4cdb0de8f680679ff3f7b05 WatchSource:0}: Error finding container 35e8a039ec77da9e5fe1f2e7bc7db65e2359a3dbc4cdb0de8f680679ff3f7b05: Status 404 returned error can't find the container with id 35e8a039ec77da9e5fe1f2e7bc7db65e2359a3dbc4cdb0de8f680679ff3f7b05 Dec 05 16:11:52 crc kubenswrapper[4778]: I1205 16:11:52.007872 4778 generic.go:334] "Generic (PLEG): container finished" podID="2210330e-e3d5-4777-ad4f-61aa0a94f73a" containerID="55b2a76829ef244b4b874a288ece92a80852c5477b7492213a728fde63611aaa" exitCode=0 Dec 05 16:11:52 crc kubenswrapper[4778]: I1205 16:11:52.008023 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" event={"ID":"2210330e-e3d5-4777-ad4f-61aa0a94f73a","Type":"ContainerDied","Data":"55b2a76829ef244b4b874a288ece92a80852c5477b7492213a728fde63611aaa"} Dec 05 16:11:52 crc kubenswrapper[4778]: I1205 16:11:52.008251 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" event={"ID":"2210330e-e3d5-4777-ad4f-61aa0a94f73a","Type":"ContainerStarted","Data":"35e8a039ec77da9e5fe1f2e7bc7db65e2359a3dbc4cdb0de8f680679ff3f7b05"} Dec 05 16:11:52 crc kubenswrapper[4778]: I1205 16:11:52.015401 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5lwxz" event={"ID":"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6","Type":"ContainerStarted","Data":"ba51b71e4dadfaf578b175a603ae2fcd982f160d0034447d0c1c1ac2675ea91e"} Dec 05 16:11:52 crc kubenswrapper[4778]: I1205 16:11:52.015457 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5lwxz" event={"ID":"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6","Type":"ContainerStarted","Data":"318f27c40f5406ea3555a59917796e17a67766ef7b4d4cb15f23d86416530e58"} Dec 05 16:11:52 crc kubenswrapper[4778]: I1205 16:11:52.015475 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5lwxz" event={"ID":"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6","Type":"ContainerStarted","Data":"891170ac844c1d3a001c5ba40facb4ff9895cc4b5dfc23e32c434341ff959e7f"} Dec 05 16:11:53 crc kubenswrapper[4778]: I1205 16:11:53.026766 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5lwxz" event={"ID":"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6","Type":"ContainerStarted","Data":"09c05eec5cd8ef813f2b21a67d2bacf3898ac9ebbe1f4e320b2212809f941659"} Dec 05 16:11:53 crc kubenswrapper[4778]: I1205 16:11:53.027089 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:53 
crc kubenswrapper[4778]: I1205 16:11:53.027105 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5lwxz" event={"ID":"c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6","Type":"ContainerStarted","Data":"67f177284503ad8a8d764a0bd6c630f8ec8e3f27b3de0f5b40428df1858a4464"} Dec 05 16:11:53 crc kubenswrapper[4778]: I1205 16:11:53.063628 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5lwxz" podStartSLOduration=6.592712467 podStartE2EDuration="16.063612223s" podCreationTimestamp="2025-12-05 16:11:37 +0000 UTC" firstStartedPulling="2025-12-05 16:11:37.730737087 +0000 UTC m=+984.834533477" lastFinishedPulling="2025-12-05 16:11:47.201636853 +0000 UTC m=+994.305433233" observedRunningTime="2025-12-05 16:11:53.060260712 +0000 UTC m=+1000.164057102" watchObservedRunningTime="2025-12-05 16:11:53.063612223 +0000 UTC m=+1000.167408603" Dec 05 16:11:56 crc kubenswrapper[4778]: I1205 16:11:56.048616 4778 generic.go:334] "Generic (PLEG): container finished" podID="2210330e-e3d5-4777-ad4f-61aa0a94f73a" containerID="b12d982783661fb7507faa2322415efd963533521211b52f66e334e18dcb9aed" exitCode=0 Dec 05 16:11:56 crc kubenswrapper[4778]: I1205 16:11:56.048751 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" event={"ID":"2210330e-e3d5-4777-ad4f-61aa0a94f73a","Type":"ContainerDied","Data":"b12d982783661fb7507faa2322415efd963533521211b52f66e334e18dcb9aed"} Dec 05 16:11:57 crc kubenswrapper[4778]: I1205 16:11:57.057439 4778 generic.go:334] "Generic (PLEG): container finished" podID="2210330e-e3d5-4777-ad4f-61aa0a94f73a" containerID="d5888bdd46e5226b26b2a150180415637e9b076e49e24b30ef45f25532e56a40" exitCode=0 Dec 05 16:11:57 crc kubenswrapper[4778]: I1205 16:11:57.057548 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" event={"ID":"2210330e-e3d5-4777-ad4f-61aa0a94f73a","Type":"ContainerDied","Data":"d5888bdd46e5226b26b2a150180415637e9b076e49e24b30ef45f25532e56a40"} Dec 05 16:11:57 crc kubenswrapper[4778]: I1205 16:11:57.492344 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:57 crc kubenswrapper[4778]: I1205 16:11:57.515796 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-nz7bg" Dec 05 16:11:57 crc kubenswrapper[4778]: I1205 16:11:57.551202 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5lwxz" Dec 05 16:11:57 crc kubenswrapper[4778]: I1205 16:11:57.644466 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-ld99k" Dec 05 16:11:58 crc kubenswrapper[4778]: I1205 16:11:58.353779 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:11:58 crc kubenswrapper[4778]: I1205 16:11:58.526691 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2210330e-e3d5-4777-ad4f-61aa0a94f73a-util\") pod \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\" (UID: \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\") " Dec 05 16:11:58 crc kubenswrapper[4778]: I1205 16:11:58.526785 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2210330e-e3d5-4777-ad4f-61aa0a94f73a-bundle\") pod \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\" (UID: \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\") " Dec 05 16:11:58 crc kubenswrapper[4778]: I1205 16:11:58.526826 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k4jr\" (UniqueName: \"kubernetes.io/projected/2210330e-e3d5-4777-ad4f-61aa0a94f73a-kube-api-access-6k4jr\") pod \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\" (UID: \"2210330e-e3d5-4777-ad4f-61aa0a94f73a\") " Dec 05 16:11:58 crc kubenswrapper[4778]: I1205 16:11:58.528188 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2210330e-e3d5-4777-ad4f-61aa0a94f73a-bundle" (OuterVolumeSpecName: "bundle") pod "2210330e-e3d5-4777-ad4f-61aa0a94f73a" (UID: "2210330e-e3d5-4777-ad4f-61aa0a94f73a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:11:58 crc kubenswrapper[4778]: I1205 16:11:58.538235 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2210330e-e3d5-4777-ad4f-61aa0a94f73a-kube-api-access-6k4jr" (OuterVolumeSpecName: "kube-api-access-6k4jr") pod "2210330e-e3d5-4777-ad4f-61aa0a94f73a" (UID: "2210330e-e3d5-4777-ad4f-61aa0a94f73a"). InnerVolumeSpecName "kube-api-access-6k4jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:11:58 crc kubenswrapper[4778]: I1205 16:11:58.542643 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2210330e-e3d5-4777-ad4f-61aa0a94f73a-util" (OuterVolumeSpecName: "util") pod "2210330e-e3d5-4777-ad4f-61aa0a94f73a" (UID: "2210330e-e3d5-4777-ad4f-61aa0a94f73a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:11:58 crc kubenswrapper[4778]: I1205 16:11:58.628865 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2210330e-e3d5-4777-ad4f-61aa0a94f73a-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:11:58 crc kubenswrapper[4778]: I1205 16:11:58.628912 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k4jr\" (UniqueName: \"kubernetes.io/projected/2210330e-e3d5-4777-ad4f-61aa0a94f73a-kube-api-access-6k4jr\") on node \"crc\" DevicePath \"\"" Dec 05 16:11:58 crc kubenswrapper[4778]: I1205 16:11:58.628936 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2210330e-e3d5-4777-ad4f-61aa0a94f73a-util\") on node \"crc\" DevicePath \"\"" Dec 05 16:11:59 crc kubenswrapper[4778]: I1205 16:11:59.074106 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" event={"ID":"2210330e-e3d5-4777-ad4f-61aa0a94f73a","Type":"ContainerDied","Data":"35e8a039ec77da9e5fe1f2e7bc7db65e2359a3dbc4cdb0de8f680679ff3f7b05"} Dec 05 16:11:59 crc kubenswrapper[4778]: I1205 16:11:59.074153 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35e8a039ec77da9e5fe1f2e7bc7db65e2359a3dbc4cdb0de8f680679ff3f7b05" Dec 05 16:11:59 crc kubenswrapper[4778]: I1205 16:11:59.074238 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt" Dec 05 16:12:03 crc kubenswrapper[4778]: I1205 16:12:03.784995 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk"] Dec 05 16:12:03 crc kubenswrapper[4778]: E1205 16:12:03.785526 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2210330e-e3d5-4777-ad4f-61aa0a94f73a" containerName="extract" Dec 05 16:12:03 crc kubenswrapper[4778]: I1205 16:12:03.785538 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2210330e-e3d5-4777-ad4f-61aa0a94f73a" containerName="extract" Dec 05 16:12:03 crc kubenswrapper[4778]: E1205 16:12:03.785549 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2210330e-e3d5-4777-ad4f-61aa0a94f73a" containerName="util" Dec 05 16:12:03 crc kubenswrapper[4778]: I1205 16:12:03.785555 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2210330e-e3d5-4777-ad4f-61aa0a94f73a" containerName="util" Dec 05 16:12:03 crc kubenswrapper[4778]: E1205 16:12:03.785567 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2210330e-e3d5-4777-ad4f-61aa0a94f73a" containerName="pull" Dec 05 16:12:03 crc kubenswrapper[4778]: I1205 16:12:03.785573 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2210330e-e3d5-4777-ad4f-61aa0a94f73a" containerName="pull" Dec 05 16:12:03 crc kubenswrapper[4778]: I1205 16:12:03.785672 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2210330e-e3d5-4777-ad4f-61aa0a94f73a" containerName="extract" Dec 05 16:12:03 crc kubenswrapper[4778]: I1205 16:12:03.786176 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk" Dec 05 16:12:03 crc kubenswrapper[4778]: I1205 16:12:03.806548 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 05 16:12:03 crc kubenswrapper[4778]: I1205 16:12:03.806570 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-vdqkk" Dec 05 16:12:03 crc kubenswrapper[4778]: I1205 16:12:03.806689 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 05 16:12:03 crc kubenswrapper[4778]: I1205 16:12:03.880422 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk"] Dec 05 16:12:03 crc kubenswrapper[4778]: I1205 16:12:03.907476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pxr6\" (UniqueName: \"kubernetes.io/projected/7ef7f642-6817-4949-81b7-40d39c037071-kube-api-access-4pxr6\") pod \"cert-manager-operator-controller-manager-64cf6dff88-ktvxk\" (UID: \"7ef7f642-6817-4949-81b7-40d39c037071\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk" Dec 05 16:12:03 crc kubenswrapper[4778]: I1205 16:12:03.907596 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7ef7f642-6817-4949-81b7-40d39c037071-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-ktvxk\" (UID: \"7ef7f642-6817-4949-81b7-40d39c037071\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk" Dec 05 16:12:04 crc kubenswrapper[4778]: I1205 16:12:04.008527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7ef7f642-6817-4949-81b7-40d39c037071-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-ktvxk\" (UID: \"7ef7f642-6817-4949-81b7-40d39c037071\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk" Dec 05 16:12:04 crc kubenswrapper[4778]: I1205 16:12:04.008633 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pxr6\" (UniqueName: \"kubernetes.io/projected/7ef7f642-6817-4949-81b7-40d39c037071-kube-api-access-4pxr6\") pod \"cert-manager-operator-controller-manager-64cf6dff88-ktvxk\" (UID: \"7ef7f642-6817-4949-81b7-40d39c037071\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk" Dec 05 16:12:04 crc kubenswrapper[4778]: I1205 16:12:04.009090 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7ef7f642-6817-4949-81b7-40d39c037071-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-ktvxk\" (UID: \"7ef7f642-6817-4949-81b7-40d39c037071\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk" Dec 05 16:12:04 crc kubenswrapper[4778]: I1205 16:12:04.033643 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pxr6\" (UniqueName: \"kubernetes.io/projected/7ef7f642-6817-4949-81b7-40d39c037071-kube-api-access-4pxr6\") pod \"cert-manager-operator-controller-manager-64cf6dff88-ktvxk\" (UID: \"7ef7f642-6817-4949-81b7-40d39c037071\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk"
Dec 05 16:12:04 crc kubenswrapper[4778]: I1205 16:12:04.119615 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk"
Dec 05 16:12:04 crc kubenswrapper[4778]: I1205 16:12:04.534006 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk"]
Dec 05 16:12:04 crc kubenswrapper[4778]: W1205 16:12:04.541714 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ef7f642_6817_4949_81b7_40d39c037071.slice/crio-a3f5ffaa9bafe82c465f21b8fe404437a15e7092df1507ec5abfd83cc69d678a WatchSource:0}: Error finding container a3f5ffaa9bafe82c465f21b8fe404437a15e7092df1507ec5abfd83cc69d678a: Status 404 returned error can't find the container with id a3f5ffaa9bafe82c465f21b8fe404437a15e7092df1507ec5abfd83cc69d678a
Dec 05 16:12:05 crc kubenswrapper[4778]: I1205 16:12:05.109821 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk" event={"ID":"7ef7f642-6817-4949-81b7-40d39c037071","Type":"ContainerStarted","Data":"a3f5ffaa9bafe82c465f21b8fe404437a15e7092df1507ec5abfd83cc69d678a"}
Dec 05 16:12:07 crc kubenswrapper[4778]: I1205 16:12:07.495008 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5lwxz"
Dec 05 16:12:08 crc kubenswrapper[4778]: I1205 16:12:08.131632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk" event={"ID":"7ef7f642-6817-4949-81b7-40d39c037071","Type":"ContainerStarted","Data":"66df60d5e6652396b4b320ac1e7332d353b019e94d35b655b2c6e79873118675"}
Dec 05 16:12:08 crc kubenswrapper[4778]: I1205 16:12:08.158161 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-ktvxk" podStartSLOduration=2.222626055 podStartE2EDuration="5.158140261s" podCreationTimestamp="2025-12-05 16:12:03 +0000 UTC" firstStartedPulling="2025-12-05 16:12:04.543945314 +0000 UTC m=+1011.647741704" lastFinishedPulling="2025-12-05 16:12:07.47945952 +0000 UTC m=+1014.583255910" observedRunningTime="2025-12-05 16:12:08.154879053 +0000 UTC m=+1015.258675433" watchObservedRunningTime="2025-12-05 16:12:08.158140261 +0000 UTC m=+1015.261936641"
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.394353 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-stl96"]
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.396201 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96"
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.398429 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.399171 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.399289 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-5j7jb"
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.414739 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-stl96"]
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.532594 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf7qg\" (UniqueName: \"kubernetes.io/projected/81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0-kube-api-access-qf7qg\") pod \"cert-manager-webhook-f4fb5df64-stl96\" (UID: \"81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96"
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.532730 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-stl96\" (UID: \"81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96"
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.634395 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf7qg\" (UniqueName: \"kubernetes.io/projected/81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0-kube-api-access-qf7qg\") pod \"cert-manager-webhook-f4fb5df64-stl96\" (UID: \"81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96"
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.634504 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-stl96\" (UID: \"81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96"
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.656723 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf7qg\" (UniqueName: \"kubernetes.io/projected/81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0-kube-api-access-qf7qg\") pod \"cert-manager-webhook-f4fb5df64-stl96\" (UID: \"81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96"
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.660443 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-stl96\" (UID: \"81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96"
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.747996 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96"
Dec 05 16:12:12 crc kubenswrapper[4778]: I1205 16:12:12.957794 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-stl96"]
Dec 05 16:12:13 crc kubenswrapper[4778]: I1205 16:12:13.176548 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96" event={"ID":"81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0","Type":"ContainerStarted","Data":"dc498cb8f8559f54f19a2abfb07f8751bba24b929b7156a0df1dfdcbc7136f11"}
Dec 05 16:12:17 crc kubenswrapper[4778]: I1205 16:12:17.729530 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw"]
Dec 05 16:12:17 crc kubenswrapper[4778]: I1205 16:12:17.730824 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw"
Dec 05 16:12:17 crc kubenswrapper[4778]: I1205 16:12:17.732576 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-w2vv5"
Dec 05 16:12:17 crc kubenswrapper[4778]: I1205 16:12:17.739846 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw"]
Dec 05 16:12:17 crc kubenswrapper[4778]: I1205 16:12:17.742675 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21fa2092-e29f-4bf1-b446-1561e7c2c35b-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-f6qgw\" (UID: \"21fa2092-e29f-4bf1-b446-1561e7c2c35b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw"
Dec 05 16:12:17 crc kubenswrapper[4778]: I1205 16:12:17.742734 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t77pd\" (UniqueName: \"kubernetes.io/projected/21fa2092-e29f-4bf1-b446-1561e7c2c35b-kube-api-access-t77pd\") pod \"cert-manager-cainjector-855d9ccff4-f6qgw\" (UID: \"21fa2092-e29f-4bf1-b446-1561e7c2c35b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw"
Dec 05 16:12:17 crc kubenswrapper[4778]: I1205 16:12:17.844375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21fa2092-e29f-4bf1-b446-1561e7c2c35b-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-f6qgw\" (UID: \"21fa2092-e29f-4bf1-b446-1561e7c2c35b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw"
Dec 05 16:12:17 crc kubenswrapper[4778]: I1205 16:12:17.844441 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t77pd\" (UniqueName: \"kubernetes.io/projected/21fa2092-e29f-4bf1-b446-1561e7c2c35b-kube-api-access-t77pd\") pod \"cert-manager-cainjector-855d9ccff4-f6qgw\" (UID: \"21fa2092-e29f-4bf1-b446-1561e7c2c35b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw"
Dec 05 16:12:17 crc kubenswrapper[4778]: I1205 16:12:17.862664 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t77pd\" (UniqueName: \"kubernetes.io/projected/21fa2092-e29f-4bf1-b446-1561e7c2c35b-kube-api-access-t77pd\") pod \"cert-manager-cainjector-855d9ccff4-f6qgw\" (UID: \"21fa2092-e29f-4bf1-b446-1561e7c2c35b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw"
Dec 05 16:12:17 crc kubenswrapper[4778]: I1205 16:12:17.862815 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21fa2092-e29f-4bf1-b446-1561e7c2c35b-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-f6qgw\" (UID: \"21fa2092-e29f-4bf1-b446-1561e7c2c35b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw"
Dec 05 16:12:18 crc kubenswrapper[4778]: I1205 16:12:18.095334 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw"
Dec 05 16:12:20 crc kubenswrapper[4778]: I1205 16:12:20.108526 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw"]
Dec 05 16:12:20 crc kubenswrapper[4778]: I1205 16:12:20.231036 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw" event={"ID":"21fa2092-e29f-4bf1-b446-1561e7c2c35b","Type":"ContainerStarted","Data":"b3fb85602d95b769d9c39cb15630288498bf7f1514ce0e30b103712267058d95"}
Dec 05 16:12:20 crc kubenswrapper[4778]: I1205 16:12:20.232394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96" event={"ID":"81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0","Type":"ContainerStarted","Data":"cb337a0c9029981256b816feb00fd7e3dfda39df34eea82780edaaeafe25b264"}
Dec 05 16:12:20 crc kubenswrapper[4778]: I1205 16:12:20.232537 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96"
Dec 05 16:12:20 crc kubenswrapper[4778]: I1205 16:12:20.255862 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96" podStartSLOduration=1.354442749 podStartE2EDuration="8.255827891s" podCreationTimestamp="2025-12-05 16:12:12 +0000 UTC" firstStartedPulling="2025-12-05 16:12:12.968323065 +0000 UTC m=+1020.072119445" lastFinishedPulling="2025-12-05 16:12:19.869708217 +0000 UTC m=+1026.973504587" observedRunningTime="2025-12-05 16:12:20.245881414 +0000 UTC m=+1027.349677794" watchObservedRunningTime="2025-12-05 16:12:20.255827891 +0000 UTC m=+1027.359624311"
Dec 05 16:12:21 crc kubenswrapper[4778]: I1205 16:12:21.240051 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw" event={"ID":"21fa2092-e29f-4bf1-b446-1561e7c2c35b","Type":"ContainerStarted","Data":"3025944a2dfc41dd6e6450ef4bc93b908d2fc9b311017cf88fdf1ac8bcd3da2e"}
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.395821 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-f6qgw" podStartSLOduration=5.395800781 podStartE2EDuration="5.395800781s" podCreationTimestamp="2025-12-05 16:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:12:21.26912024 +0000 UTC m=+1028.372916650" watchObservedRunningTime="2025-12-05 16:12:22.395800781 +0000 UTC m=+1029.499597161"
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.401685 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-j8xvc"]
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.402554 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-j8xvc"
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.404284 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-d4jjd"
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.417095 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-j8xvc"]
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.514992 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spkx4\" (UniqueName: \"kubernetes.io/projected/db44b5fa-a7f8-4aac-bf9a-5669e0fad581-kube-api-access-spkx4\") pod \"cert-manager-86cb77c54b-j8xvc\" (UID: \"db44b5fa-a7f8-4aac-bf9a-5669e0fad581\") " pod="cert-manager/cert-manager-86cb77c54b-j8xvc"
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.515060 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db44b5fa-a7f8-4aac-bf9a-5669e0fad581-bound-sa-token\") pod \"cert-manager-86cb77c54b-j8xvc\" (UID: \"db44b5fa-a7f8-4aac-bf9a-5669e0fad581\") " pod="cert-manager/cert-manager-86cb77c54b-j8xvc"
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.616079 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db44b5fa-a7f8-4aac-bf9a-5669e0fad581-bound-sa-token\") pod \"cert-manager-86cb77c54b-j8xvc\" (UID: \"db44b5fa-a7f8-4aac-bf9a-5669e0fad581\") " pod="cert-manager/cert-manager-86cb77c54b-j8xvc"
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.616261 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spkx4\" (UniqueName: \"kubernetes.io/projected/db44b5fa-a7f8-4aac-bf9a-5669e0fad581-kube-api-access-spkx4\") pod \"cert-manager-86cb77c54b-j8xvc\" (UID: \"db44b5fa-a7f8-4aac-bf9a-5669e0fad581\") " pod="cert-manager/cert-manager-86cb77c54b-j8xvc"
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.640930 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spkx4\" (UniqueName: \"kubernetes.io/projected/db44b5fa-a7f8-4aac-bf9a-5669e0fad581-kube-api-access-spkx4\") pod \"cert-manager-86cb77c54b-j8xvc\" (UID: \"db44b5fa-a7f8-4aac-bf9a-5669e0fad581\") " pod="cert-manager/cert-manager-86cb77c54b-j8xvc"
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.643967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db44b5fa-a7f8-4aac-bf9a-5669e0fad581-bound-sa-token\") pod \"cert-manager-86cb77c54b-j8xvc\" (UID: \"db44b5fa-a7f8-4aac-bf9a-5669e0fad581\") " pod="cert-manager/cert-manager-86cb77c54b-j8xvc"
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.740167 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-j8xvc"
Dec 05 16:12:22 crc kubenswrapper[4778]: I1205 16:12:22.925764 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-j8xvc"]
Dec 05 16:12:22 crc kubenswrapper[4778]: W1205 16:12:22.929471 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb44b5fa_a7f8_4aac_bf9a_5669e0fad581.slice/crio-625d604195d3013b1108c3bfacc967f134912e62efb64ef8afb7e294dcdbfbec WatchSource:0}: Error finding container 625d604195d3013b1108c3bfacc967f134912e62efb64ef8afb7e294dcdbfbec: Status 404 returned error can't find the container with id 625d604195d3013b1108c3bfacc967f134912e62efb64ef8afb7e294dcdbfbec
Dec 05 16:12:23 crc kubenswrapper[4778]: I1205 16:12:23.261636 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-j8xvc" event={"ID":"db44b5fa-a7f8-4aac-bf9a-5669e0fad581","Type":"ContainerStarted","Data":"baeee59396ccc2e82063071e4e2ac1816240a096e8a3c3e06d33dcdff32a0af0"}
Dec 05 16:12:23 crc kubenswrapper[4778]: I1205 16:12:23.261699 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-j8xvc" event={"ID":"db44b5fa-a7f8-4aac-bf9a-5669e0fad581","Type":"ContainerStarted","Data":"625d604195d3013b1108c3bfacc967f134912e62efb64ef8afb7e294dcdbfbec"}
Dec 05 16:12:23 crc kubenswrapper[4778]: I1205 16:12:23.296580 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-j8xvc" podStartSLOduration=1.296545641 podStartE2EDuration="1.296545641s" podCreationTimestamp="2025-12-05 16:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:12:23.291969957 +0000 UTC m=+1030.395766357" watchObservedRunningTime="2025-12-05 16:12:23.296545641 +0000 UTC m=+1030.400342111"
Dec 05 16:12:27 crc kubenswrapper[4778]: I1205 16:12:27.752644 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-stl96"
Dec 05 16:12:31 crc kubenswrapper[4778]: I1205 16:12:31.031640 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-flgjp"]
Dec 05 16:12:31 crc kubenswrapper[4778]: I1205 16:12:31.033040 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-flgjp"
Dec 05 16:12:31 crc kubenswrapper[4778]: I1205 16:12:31.036214 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Dec 05 16:12:31 crc kubenswrapper[4778]: I1205 16:12:31.036265 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6jbng"
Dec 05 16:12:31 crc kubenswrapper[4778]: I1205 16:12:31.042580 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Dec 05 16:12:31 crc kubenswrapper[4778]: I1205 16:12:31.053491 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-flgjp"]
Dec 05 16:12:31 crc kubenswrapper[4778]: I1205 16:12:31.137282 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhzbv\" (UniqueName: \"kubernetes.io/projected/b6ac2f11-7127-47bd-bd32-904e8383126b-kube-api-access-zhzbv\") pod \"openstack-operator-index-flgjp\" (UID: \"b6ac2f11-7127-47bd-bd32-904e8383126b\") " pod="openstack-operators/openstack-operator-index-flgjp"
Dec 05 16:12:31 crc kubenswrapper[4778]: I1205 16:12:31.239005 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhzbv\" (UniqueName: \"kubernetes.io/projected/b6ac2f11-7127-47bd-bd32-904e8383126b-kube-api-access-zhzbv\") pod \"openstack-operator-index-flgjp\" (UID: \"b6ac2f11-7127-47bd-bd32-904e8383126b\") " pod="openstack-operators/openstack-operator-index-flgjp"
Dec 05 16:12:31 crc kubenswrapper[4778]: I1205 16:12:31.259923 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhzbv\" (UniqueName: \"kubernetes.io/projected/b6ac2f11-7127-47bd-bd32-904e8383126b-kube-api-access-zhzbv\") pod \"openstack-operator-index-flgjp\" (UID: \"b6ac2f11-7127-47bd-bd32-904e8383126b\") " pod="openstack-operators/openstack-operator-index-flgjp"
Dec 05 16:12:31 crc kubenswrapper[4778]: I1205 16:12:31.359479 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-flgjp"
Dec 05 16:12:31 crc kubenswrapper[4778]: I1205 16:12:31.611932 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-flgjp"]
Dec 05 16:12:32 crc kubenswrapper[4778]: I1205 16:12:32.315649 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-flgjp" event={"ID":"b6ac2f11-7127-47bd-bd32-904e8383126b","Type":"ContainerStarted","Data":"95401adec257cdb3e9bc88e9c4cd96fe105223587ed1323786598ec40ea84c0e"}
Dec 05 16:12:34 crc kubenswrapper[4778]: I1205 16:12:34.329861 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-flgjp" event={"ID":"b6ac2f11-7127-47bd-bd32-904e8383126b","Type":"ContainerStarted","Data":"729a680c2ec320debd77262faf9d6d31f9fd0e03008255df36e0d63166887d69"}
Dec 05 16:12:34 crc kubenswrapper[4778]: I1205 16:12:34.349301 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-flgjp" podStartSLOduration=0.995346827 podStartE2EDuration="3.349275767s" podCreationTimestamp="2025-12-05 16:12:31 +0000 UTC" firstStartedPulling="2025-12-05 16:12:31.621048888 +0000 UTC m=+1038.724845268" lastFinishedPulling="2025-12-05 16:12:33.974977828 +0000 UTC m=+1041.078774208" observedRunningTime="2025-12-05 16:12:34.346621686 +0000 UTC m=+1041.450418116" watchObservedRunningTime="2025-12-05 16:12:34.349275767 +0000 UTC m=+1041.453072187"
Dec 05 16:12:41 crc kubenswrapper[4778]: I1205 16:12:41.360083 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-flgjp"
Dec 05 16:12:41 crc kubenswrapper[4778]: I1205 16:12:41.360623 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-flgjp"
Dec 05 16:12:41 crc kubenswrapper[4778]: I1205 16:12:41.391851 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-flgjp"
Dec 05 16:12:41 crc kubenswrapper[4778]: I1205 16:12:41.426671 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-flgjp"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.202304 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"]
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.203903 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.205674 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-d8zml"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.213151 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"]
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.366641 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5af7422-412e-468f-8b0c-dee56152cbfd-util\") pod \"43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt\" (UID: \"c5af7422-412e-468f-8b0c-dee56152cbfd\") " pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.366694 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4rkc\" (UniqueName: \"kubernetes.io/projected/c5af7422-412e-468f-8b0c-dee56152cbfd-kube-api-access-n4rkc\") pod \"43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt\" (UID: \"c5af7422-412e-468f-8b0c-dee56152cbfd\") " pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.366765 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5af7422-412e-468f-8b0c-dee56152cbfd-bundle\") pod \"43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt\" (UID: \"c5af7422-412e-468f-8b0c-dee56152cbfd\") " pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.468146 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5af7422-412e-468f-8b0c-dee56152cbfd-util\") pod \"43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt\" (UID: \"c5af7422-412e-468f-8b0c-dee56152cbfd\") " pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.468225 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4rkc\" (UniqueName: \"kubernetes.io/projected/c5af7422-412e-468f-8b0c-dee56152cbfd-kube-api-access-n4rkc\") pod \"43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt\" (UID: \"c5af7422-412e-468f-8b0c-dee56152cbfd\") " pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.468306 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5af7422-412e-468f-8b0c-dee56152cbfd-bundle\") pod \"43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt\" (UID: \"c5af7422-412e-468f-8b0c-dee56152cbfd\") " pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.468946 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5af7422-412e-468f-8b0c-dee56152cbfd-bundle\") pod \"43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt\" (UID: \"c5af7422-412e-468f-8b0c-dee56152cbfd\") " pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.469109 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5af7422-412e-468f-8b0c-dee56152cbfd-util\") pod \"43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt\" (UID: \"c5af7422-412e-468f-8b0c-dee56152cbfd\") " pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.497330 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4rkc\" (UniqueName: \"kubernetes.io/projected/c5af7422-412e-468f-8b0c-dee56152cbfd-kube-api-access-n4rkc\") pod \"43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt\" (UID: \"c5af7422-412e-468f-8b0c-dee56152cbfd\") " pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.523911 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:48 crc kubenswrapper[4778]: I1205 16:12:48.940244 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"]
Dec 05 16:12:48 crc kubenswrapper[4778]: W1205 16:12:48.951691 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5af7422_412e_468f_8b0c_dee56152cbfd.slice/crio-7c7aed1b99496930f2a5e3c6c2daaa04c2f0f426b1421b67531e892a907721f3 WatchSource:0}: Error finding container 7c7aed1b99496930f2a5e3c6c2daaa04c2f0f426b1421b67531e892a907721f3: Status 404 returned error can't find the container with id 7c7aed1b99496930f2a5e3c6c2daaa04c2f0f426b1421b67531e892a907721f3
Dec 05 16:12:49 crc kubenswrapper[4778]: I1205 16:12:49.426910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt" event={"ID":"c5af7422-412e-468f-8b0c-dee56152cbfd","Type":"ContainerStarted","Data":"7c7aed1b99496930f2a5e3c6c2daaa04c2f0f426b1421b67531e892a907721f3"}
Dec 05 16:12:50 crc kubenswrapper[4778]: I1205 16:12:50.435650 4778 generic.go:334] "Generic (PLEG): container finished" podID="c5af7422-412e-468f-8b0c-dee56152cbfd" containerID="e34cfdb21e5230d9654441cecc2375547c90a47f3afd1adde95a3e97e98bb179" exitCode=0
Dec 05 16:12:50 crc kubenswrapper[4778]: I1205 16:12:50.435691 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt" event={"ID":"c5af7422-412e-468f-8b0c-dee56152cbfd","Type":"ContainerDied","Data":"e34cfdb21e5230d9654441cecc2375547c90a47f3afd1adde95a3e97e98bb179"}
Dec 05 16:12:51 crc kubenswrapper[4778]: I1205 16:12:51.444214 4778 generic.go:334] "Generic (PLEG): container finished" podID="c5af7422-412e-468f-8b0c-dee56152cbfd" containerID="185146320c1e56629860b8dab12c5403054f560a332090fab09e3d1f42e7e922" exitCode=0
Dec 05 16:12:51 crc kubenswrapper[4778]: I1205 16:12:51.444259 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt" event={"ID":"c5af7422-412e-468f-8b0c-dee56152cbfd","Type":"ContainerDied","Data":"185146320c1e56629860b8dab12c5403054f560a332090fab09e3d1f42e7e922"}
Dec 05 16:12:52 crc kubenswrapper[4778]: I1205 16:12:52.980162 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8mpgt"]
Dec 05 16:12:52 crc kubenswrapper[4778]: I1205 16:12:52.981688 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.007947 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mpgt"]
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.152039 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/358a40e3-8021-4574-9053-dee7fbaa74a4-utilities\") pod \"redhat-marketplace-8mpgt\" (UID: \"358a40e3-8021-4574-9053-dee7fbaa74a4\") " pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.152113 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/358a40e3-8021-4574-9053-dee7fbaa74a4-catalog-content\") pod \"redhat-marketplace-8mpgt\" (UID: \"358a40e3-8021-4574-9053-dee7fbaa74a4\") " pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.152135 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxx6t\" (UniqueName: \"kubernetes.io/projected/358a40e3-8021-4574-9053-dee7fbaa74a4-kube-api-access-hxx6t\") pod \"redhat-marketplace-8mpgt\" (UID: \"358a40e3-8021-4574-9053-dee7fbaa74a4\") " pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.253342 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/358a40e3-8021-4574-9053-dee7fbaa74a4-utilities\") pod \"redhat-marketplace-8mpgt\" (UID: \"358a40e3-8021-4574-9053-dee7fbaa74a4\") " pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.253461 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/358a40e3-8021-4574-9053-dee7fbaa74a4-catalog-content\") pod \"redhat-marketplace-8mpgt\" (UID: \"358a40e3-8021-4574-9053-dee7fbaa74a4\") " pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.253504 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxx6t\" (UniqueName: \"kubernetes.io/projected/358a40e3-8021-4574-9053-dee7fbaa74a4-kube-api-access-hxx6t\") pod \"redhat-marketplace-8mpgt\" (UID: \"358a40e3-8021-4574-9053-dee7fbaa74a4\") " pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.254003 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/358a40e3-8021-4574-9053-dee7fbaa74a4-utilities\") pod \"redhat-marketplace-8mpgt\" (UID: \"358a40e3-8021-4574-9053-dee7fbaa74a4\") " pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.254057 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/358a40e3-8021-4574-9053-dee7fbaa74a4-catalog-content\") pod \"redhat-marketplace-8mpgt\" (UID: \"358a40e3-8021-4574-9053-dee7fbaa74a4\") " pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.275305 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxx6t\" (UniqueName: \"kubernetes.io/projected/358a40e3-8021-4574-9053-dee7fbaa74a4-kube-api-access-hxx6t\") pod \"redhat-marketplace-8mpgt\" (UID: \"358a40e3-8021-4574-9053-dee7fbaa74a4\") " pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.296680 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.462703 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt" event={"ID":"c5af7422-412e-468f-8b0c-dee56152cbfd","Type":"ContainerStarted","Data":"4dd5bbf76165c0a2d07cb332693885dbbfea6503decee2e5e5c42d803ac88835"}
Dec 05 16:12:53 crc kubenswrapper[4778]: I1205 16:12:53.726026 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mpgt"]
Dec 05 16:12:53 crc kubenswrapper[4778]: W1205 16:12:53.735646 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod358a40e3_8021_4574_9053_dee7fbaa74a4.slice/crio-bbbf22940fa1e6cebaa9f50c5784c1cea086da83e339776589ebcd22a6b788bf WatchSource:0}: Error finding container bbbf22940fa1e6cebaa9f50c5784c1cea086da83e339776589ebcd22a6b788bf: Status 404 returned error can't find the container with id bbbf22940fa1e6cebaa9f50c5784c1cea086da83e339776589ebcd22a6b788bf
Dec 05 16:12:54 crc kubenswrapper[4778]: I1205 16:12:54.469468 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mpgt" event={"ID":"358a40e3-8021-4574-9053-dee7fbaa74a4","Type":"ContainerStarted","Data":"bbbf22940fa1e6cebaa9f50c5784c1cea086da83e339776589ebcd22a6b788bf"}
Dec 05 16:12:56 crc kubenswrapper[4778]: I1205 16:12:56.483301 4778 generic.go:334] "Generic (PLEG): container finished" podID="c5af7422-412e-468f-8b0c-dee56152cbfd" containerID="4dd5bbf76165c0a2d07cb332693885dbbfea6503decee2e5e5c42d803ac88835" exitCode=0
Dec 05 16:12:56 crc kubenswrapper[4778]: I1205 16:12:56.483419 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt" event={"ID":"c5af7422-412e-468f-8b0c-dee56152cbfd","Type":"ContainerDied","Data":"4dd5bbf76165c0a2d07cb332693885dbbfea6503decee2e5e5c42d803ac88835"}
Dec 05 16:12:56 crc kubenswrapper[4778]: I1205 16:12:56.484967 4778 generic.go:334] "Generic (PLEG): container finished" podID="358a40e3-8021-4574-9053-dee7fbaa74a4" containerID="243ac2bfdefbdb6bc066ecfc206322ba864d6c9cae8e118f9648fac9e8fce3d7" exitCode=0
Dec 05 16:12:56 crc kubenswrapper[4778]: I1205 16:12:56.485007 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mpgt" event={"ID":"358a40e3-8021-4574-9053-dee7fbaa74a4","Type":"ContainerDied","Data":"243ac2bfdefbdb6bc066ecfc206322ba864d6c9cae8e118f9648fac9e8fce3d7"}
Dec 05 16:12:57 crc kubenswrapper[4778]: I1205 16:12:57.494277 4778 generic.go:334] "Generic (PLEG): container finished" podID="358a40e3-8021-4574-9053-dee7fbaa74a4" containerID="0b452dab7a6d173affcddd48c58bb24c8a9c6b003a70fc3a202d71016522ecf0" exitCode=0
Dec 05 16:12:57 crc kubenswrapper[4778]: I1205 16:12:57.494355 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mpgt" event={"ID":"358a40e3-8021-4574-9053-dee7fbaa74a4","Type":"ContainerDied","Data":"0b452dab7a6d173affcddd48c58bb24c8a9c6b003a70fc3a202d71016522ecf0"}
Dec 05 16:12:57 crc kubenswrapper[4778]: I1205 16:12:57.722541 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:57 crc kubenswrapper[4778]: I1205 16:12:57.826052 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5af7422-412e-468f-8b0c-dee56152cbfd-bundle\") pod \"c5af7422-412e-468f-8b0c-dee56152cbfd\" (UID: \"c5af7422-412e-468f-8b0c-dee56152cbfd\") "
Dec 05 16:12:57 crc kubenswrapper[4778]: I1205 16:12:57.826154 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4rkc\" (UniqueName: \"kubernetes.io/projected/c5af7422-412e-468f-8b0c-dee56152cbfd-kube-api-access-n4rkc\") pod \"c5af7422-412e-468f-8b0c-dee56152cbfd\" (UID: \"c5af7422-412e-468f-8b0c-dee56152cbfd\") "
Dec 05 16:12:57 crc kubenswrapper[4778]: I1205 16:12:57.826198 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5af7422-412e-468f-8b0c-dee56152cbfd-util\") pod \"c5af7422-412e-468f-8b0c-dee56152cbfd\" (UID: \"c5af7422-412e-468f-8b0c-dee56152cbfd\") "
Dec 05 16:12:57 crc kubenswrapper[4778]: I1205 16:12:57.826843 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5af7422-412e-468f-8b0c-dee56152cbfd-bundle" (OuterVolumeSpecName: "bundle") pod "c5af7422-412e-468f-8b0c-dee56152cbfd" (UID: "c5af7422-412e-468f-8b0c-dee56152cbfd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:12:57 crc kubenswrapper[4778]: I1205 16:12:57.832851 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5af7422-412e-468f-8b0c-dee56152cbfd-kube-api-access-n4rkc" (OuterVolumeSpecName: "kube-api-access-n4rkc") pod "c5af7422-412e-468f-8b0c-dee56152cbfd" (UID: "c5af7422-412e-468f-8b0c-dee56152cbfd"). InnerVolumeSpecName "kube-api-access-n4rkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:12:57 crc kubenswrapper[4778]: I1205 16:12:57.836484 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5af7422-412e-468f-8b0c-dee56152cbfd-util" (OuterVolumeSpecName: "util") pod "c5af7422-412e-468f-8b0c-dee56152cbfd" (UID: "c5af7422-412e-468f-8b0c-dee56152cbfd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:12:57 crc kubenswrapper[4778]: I1205 16:12:57.927448 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5af7422-412e-468f-8b0c-dee56152cbfd-util\") on node \"crc\" DevicePath \"\""
Dec 05 16:12:57 crc kubenswrapper[4778]: I1205 16:12:57.927474 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5af7422-412e-468f-8b0c-dee56152cbfd-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 16:12:57 crc kubenswrapper[4778]: I1205 16:12:57.927482 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4rkc\" (UniqueName: \"kubernetes.io/projected/c5af7422-412e-468f-8b0c-dee56152cbfd-kube-api-access-n4rkc\") on node \"crc\" DevicePath \"\""
Dec 05 16:12:58 crc kubenswrapper[4778]: I1205 16:12:58.503329 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt" event={"ID":"c5af7422-412e-468f-8b0c-dee56152cbfd","Type":"ContainerDied","Data":"7c7aed1b99496930f2a5e3c6c2daaa04c2f0f426b1421b67531e892a907721f3"}
Dec 05 16:12:58 crc kubenswrapper[4778]: I1205 16:12:58.503989 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c7aed1b99496930f2a5e3c6c2daaa04c2f0f426b1421b67531e892a907721f3"
Dec 05 16:12:58 crc kubenswrapper[4778]: I1205 16:12:58.503359 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt"
Dec 05 16:12:58 crc kubenswrapper[4778]: I1205 16:12:58.505930 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mpgt" event={"ID":"358a40e3-8021-4574-9053-dee7fbaa74a4","Type":"ContainerStarted","Data":"44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554"}
Dec 05 16:12:58 crc kubenswrapper[4778]: I1205 16:12:58.533914 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8mpgt" podStartSLOduration=5.098207005 podStartE2EDuration="6.533891027s" podCreationTimestamp="2025-12-05 16:12:52 +0000 UTC" firstStartedPulling="2025-12-05 16:12:56.486451674 +0000 UTC m=+1063.590248064" lastFinishedPulling="2025-12-05 16:12:57.922135706 +0000 UTC m=+1065.025932086" observedRunningTime="2025-12-05 16:12:58.530177848 +0000 UTC m=+1065.633974258" watchObservedRunningTime="2025-12-05 16:12:58.533891027 +0000 UTC m=+1065.637687417"
Dec 05 16:13:03 crc kubenswrapper[4778]: I1205 16:13:03.297188 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:13:03 crc kubenswrapper[4778]: I1205 16:13:03.298504 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:13:03 crc kubenswrapper[4778]: I1205 16:13:03.339108 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:13:03 crc kubenswrapper[4778]: I1205 16:13:03.414761 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 16:13:03 crc kubenswrapper[4778]: I1205 16:13:03.414818 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 16:13:03 crc kubenswrapper[4778]: I1205 16:13:03.589114 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:13:05 crc kubenswrapper[4778]: I1205 16:13:05.772535 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mpgt"]
Dec 05 16:13:06 crc kubenswrapper[4778]: I1205 16:13:06.560585 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8mpgt" podUID="358a40e3-8021-4574-9053-dee7fbaa74a4" containerName="registry-server" containerID="cri-o://44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554" gracePeriod=2
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.070490 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"]
Dec 05 16:13:07 crc kubenswrapper[4778]: E1205 16:13:07.071173 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5af7422-412e-468f-8b0c-dee56152cbfd" containerName="util"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.071190 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5af7422-412e-468f-8b0c-dee56152cbfd" containerName="util"
Dec 05 16:13:07 crc kubenswrapper[4778]: E1205 16:13:07.071216 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5af7422-412e-468f-8b0c-dee56152cbfd" containerName="pull"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.071223 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5af7422-412e-468f-8b0c-dee56152cbfd" containerName="pull"
Dec 05 16:13:07 crc kubenswrapper[4778]: E1205 16:13:07.071234 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5af7422-412e-468f-8b0c-dee56152cbfd" containerName="extract"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.071241 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5af7422-412e-468f-8b0c-dee56152cbfd" containerName="extract"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.071424 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5af7422-412e-468f-8b0c-dee56152cbfd" containerName="extract"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.072025 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.073857 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-2bg5b"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.167945 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"]
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.253216 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7jcs\" (UniqueName: \"kubernetes.io/projected/aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b-kube-api-access-m7jcs\") pod \"openstack-operator-controller-operator-d55d8cbb8-cskkl\" (UID: \"aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b\") " pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.354879 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7jcs\" (UniqueName: \"kubernetes.io/projected/aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b-kube-api-access-m7jcs\") pod \"openstack-operator-controller-operator-d55d8cbb8-cskkl\" (UID: \"aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b\") " pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.374783 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7jcs\" (UniqueName: \"kubernetes.io/projected/aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b-kube-api-access-m7jcs\") pod \"openstack-operator-controller-operator-d55d8cbb8-cskkl\" (UID: \"aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b\") " pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.439253 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.444773 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.455556 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/358a40e3-8021-4574-9053-dee7fbaa74a4-utilities\") pod \"358a40e3-8021-4574-9053-dee7fbaa74a4\" (UID: \"358a40e3-8021-4574-9053-dee7fbaa74a4\") "
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.455926 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxx6t\" (UniqueName: \"kubernetes.io/projected/358a40e3-8021-4574-9053-dee7fbaa74a4-kube-api-access-hxx6t\") pod \"358a40e3-8021-4574-9053-dee7fbaa74a4\" (UID: \"358a40e3-8021-4574-9053-dee7fbaa74a4\") "
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.455968 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/358a40e3-8021-4574-9053-dee7fbaa74a4-catalog-content\") pod \"358a40e3-8021-4574-9053-dee7fbaa74a4\" (UID: \"358a40e3-8021-4574-9053-dee7fbaa74a4\") "
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.466343 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/358a40e3-8021-4574-9053-dee7fbaa74a4-utilities" (OuterVolumeSpecName: "utilities") pod "358a40e3-8021-4574-9053-dee7fbaa74a4" (UID: "358a40e3-8021-4574-9053-dee7fbaa74a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.470846 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/358a40e3-8021-4574-9053-dee7fbaa74a4-kube-api-access-hxx6t" (OuterVolumeSpecName: "kube-api-access-hxx6t") pod "358a40e3-8021-4574-9053-dee7fbaa74a4" (UID: "358a40e3-8021-4574-9053-dee7fbaa74a4"). InnerVolumeSpecName "kube-api-access-hxx6t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.486468 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/358a40e3-8021-4574-9053-dee7fbaa74a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "358a40e3-8021-4574-9053-dee7fbaa74a4" (UID: "358a40e3-8021-4574-9053-dee7fbaa74a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.557463 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxx6t\" (UniqueName: \"kubernetes.io/projected/358a40e3-8021-4574-9053-dee7fbaa74a4-kube-api-access-hxx6t\") on node \"crc\" DevicePath \"\""
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.557718 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/358a40e3-8021-4574-9053-dee7fbaa74a4-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.557728 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/358a40e3-8021-4574-9053-dee7fbaa74a4-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.595682 4778 generic.go:334] "Generic (PLEG): container finished" podID="358a40e3-8021-4574-9053-dee7fbaa74a4" containerID="44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554" exitCode=0
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.595851 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mpgt"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.595729 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mpgt" event={"ID":"358a40e3-8021-4574-9053-dee7fbaa74a4","Type":"ContainerDied","Data":"44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554"}
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.606537 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mpgt" event={"ID":"358a40e3-8021-4574-9053-dee7fbaa74a4","Type":"ContainerDied","Data":"bbbf22940fa1e6cebaa9f50c5784c1cea086da83e339776589ebcd22a6b788bf"}
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.606569 4778 scope.go:117] "RemoveContainer" containerID="44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.631625 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mpgt"]
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.636974 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mpgt"]
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.643584 4778 scope.go:117] "RemoveContainer" containerID="0b452dab7a6d173affcddd48c58bb24c8a9c6b003a70fc3a202d71016522ecf0"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.662104 4778 scope.go:117] "RemoveContainer" containerID="243ac2bfdefbdb6bc066ecfc206322ba864d6c9cae8e118f9648fac9e8fce3d7"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.675875 4778 scope.go:117] "RemoveContainer" containerID="44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554"
Dec 05 16:13:07 crc kubenswrapper[4778]: E1205 16:13:07.676303 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554\": container with ID starting with 44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554 not found: ID does not exist" containerID="44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.676334 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554"} err="failed to get container status \"44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554\": rpc error: code = NotFound desc = could not find container \"44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554\": container with ID starting with 44244da560a4b93af9ee4ee8898a0ffb5eb22416f1be76da18ed7165fa7f5554 not found: ID does not exist"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.676357 4778 scope.go:117] "RemoveContainer" containerID="0b452dab7a6d173affcddd48c58bb24c8a9c6b003a70fc3a202d71016522ecf0"
Dec 05 16:13:07 crc kubenswrapper[4778]: E1205 16:13:07.676616 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b452dab7a6d173affcddd48c58bb24c8a9c6b003a70fc3a202d71016522ecf0\": container with ID starting with 0b452dab7a6d173affcddd48c58bb24c8a9c6b003a70fc3a202d71016522ecf0 not found: ID does not exist" containerID="0b452dab7a6d173affcddd48c58bb24c8a9c6b003a70fc3a202d71016522ecf0"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.676643 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b452dab7a6d173affcddd48c58bb24c8a9c6b003a70fc3a202d71016522ecf0"} err="failed to get container status \"0b452dab7a6d173affcddd48c58bb24c8a9c6b003a70fc3a202d71016522ecf0\": rpc error: code = NotFound desc = could not find container \"0b452dab7a6d173affcddd48c58bb24c8a9c6b003a70fc3a202d71016522ecf0\": container with ID starting with 0b452dab7a6d173affcddd48c58bb24c8a9c6b003a70fc3a202d71016522ecf0 not found: ID does not exist"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.676655 4778 scope.go:117] "RemoveContainer" containerID="243ac2bfdefbdb6bc066ecfc206322ba864d6c9cae8e118f9648fac9e8fce3d7"
Dec 05 16:13:07 crc kubenswrapper[4778]: E1205 16:13:07.676898 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"243ac2bfdefbdb6bc066ecfc206322ba864d6c9cae8e118f9648fac9e8fce3d7\": container with ID starting with 243ac2bfdefbdb6bc066ecfc206322ba864d6c9cae8e118f9648fac9e8fce3d7 not found: ID does not exist" containerID="243ac2bfdefbdb6bc066ecfc206322ba864d6c9cae8e118f9648fac9e8fce3d7"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.676936 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"243ac2bfdefbdb6bc066ecfc206322ba864d6c9cae8e118f9648fac9e8fce3d7"} err="failed to get container status \"243ac2bfdefbdb6bc066ecfc206322ba864d6c9cae8e118f9648fac9e8fce3d7\": rpc error: code = NotFound desc = could not find container \"243ac2bfdefbdb6bc066ecfc206322ba864d6c9cae8e118f9648fac9e8fce3d7\": container with ID starting with 243ac2bfdefbdb6bc066ecfc206322ba864d6c9cae8e118f9648fac9e8fce3d7 not found: ID does not exist"
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.907394 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"]
Dec 05 16:13:07 crc kubenswrapper[4778]: I1205 16:13:07.915566 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 16:13:08 crc kubenswrapper[4778]: I1205 16:13:08.618436 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl" event={"ID":"aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b","Type":"ContainerStarted","Data":"9e42c1bf0dc44841a8bf4af2f1b2786e01b216a695f050d92f2fa53ed4ba0623"}
Dec 05 16:13:09 crc kubenswrapper[4778]: I1205 16:13:09.263067 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="358a40e3-8021-4574-9053-dee7fbaa74a4" path="/var/lib/kubelet/pods/358a40e3-8021-4574-9053-dee7fbaa74a4/volumes"
Dec 05 16:13:12 crc kubenswrapper[4778]: I1205 16:13:12.665208 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl" event={"ID":"aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b","Type":"ContainerStarted","Data":"8aa93b891d3edfd33ea9b6b490ceae30099254c54cf2ecc6cd9780117b0d3c2e"}
Dec 05 16:13:12 crc kubenswrapper[4778]: I1205 16:13:12.666479 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"
Dec 05 16:13:12 crc kubenswrapper[4778]: I1205 16:13:12.702329 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl" podStartSLOduration=1.8466042790000001 podStartE2EDuration="5.702305598s" podCreationTimestamp="2025-12-05 16:13:07 +0000 UTC" firstStartedPulling="2025-12-05 16:13:07.915325175 +0000 UTC m=+1075.019121555" lastFinishedPulling="2025-12-05 16:13:11.771026494 +0000 UTC m=+1078.874822874" observedRunningTime="2025-12-05 16:13:12.695856926 +0000 UTC m=+1079.799653356" watchObservedRunningTime="2025-12-05 16:13:12.702305598 +0000 UTC m=+1079.806101998"
Dec 05 16:13:17 crc kubenswrapper[4778]: I1205 16:13:17.449793 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"
Dec 05 16:13:33 crc kubenswrapper[4778]: I1205 16:13:33.414605 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 16:13:33 crc kubenswrapper[4778]: I1205 16:13:33.415138 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 16:13:36 crc kubenswrapper[4778]: I1205 16:13:36.963737 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn"]
Dec 05 16:13:36 crc kubenswrapper[4778]: E1205 16:13:36.964600 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="358a40e3-8021-4574-9053-dee7fbaa74a4" containerName="extract-content"
Dec 05 16:13:36 crc kubenswrapper[4778]: I1205 16:13:36.964616 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="358a40e3-8021-4574-9053-dee7fbaa74a4" containerName="extract-content"
Dec 05 16:13:36 crc kubenswrapper[4778]: E1205 16:13:36.964636 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="358a40e3-8021-4574-9053-dee7fbaa74a4" containerName="registry-server"
Dec 05 16:13:36 crc kubenswrapper[4778]: I1205 16:13:36.964644 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="358a40e3-8021-4574-9053-dee7fbaa74a4" containerName="registry-server"
Dec 05 16:13:36 crc kubenswrapper[4778]: E1205 16:13:36.964655 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="358a40e3-8021-4574-9053-dee7fbaa74a4" containerName="extract-utilities"
Dec 05 16:13:36 crc kubenswrapper[4778]: I1205 16:13:36.964663 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="358a40e3-8021-4574-9053-dee7fbaa74a4" containerName="extract-utilities"
Dec 05 16:13:36 crc kubenswrapper[4778]: I1205 16:13:36.964802 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="358a40e3-8021-4574-9053-dee7fbaa74a4" containerName="registry-server"
Dec 05 16:13:36 crc kubenswrapper[4778]: I1205 16:13:36.965633 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn"
Dec 05 16:13:36 crc kubenswrapper[4778]: I1205 16:13:36.968707 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-lrp8s"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.008280 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.009393 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.011057 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-d9ggd"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.012299 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.013341 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.014495 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-9kk8j"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.016422 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.017690 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.019847 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-dkgsc"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.023583 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.031826 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.036010 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.042665 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkdnb\" (UniqueName: \"kubernetes.io/projected/eb7ca6ca-0075-46eb-9a5c-e445d06c3425-kube-api-access-dkdnb\") pod \"barbican-operator-controller-manager-7d9dfd778-krhmn\" (UID: \"eb7ca6ca-0075-46eb-9a5c-e445d06c3425\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.045042 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.057401 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.058296 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.063140 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-n4grt"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.073431 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.074429 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.076129 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-8q7cz"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.098035 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.116632 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.129175 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.130577 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.132565 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-zzd7z"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.139332 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-8htgg"]
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.141465 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.150188 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtfg8\" (UniqueName: \"kubernetes.io/projected/219350c9-1342-44bb-82d0-6a121ebb354b-kube-api-access-qtfg8\") pod \"heat-operator-controller-manager-5f64f6f8bb-tqqnh\" (UID: \"219350c9-1342-44bb-82d0-6a121ebb354b\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.150243 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wx7t\" (UniqueName: \"kubernetes.io/projected/a1f52513-b5c6-45ac-9cf7-42e04ba8b114-kube-api-access-8wx7t\") pod \"glance-operator-controller-manager-77987cd8cd-kwnlh\" (UID: \"a1f52513-b5c6-45ac-9cf7-42e04ba8b114\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.150292 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdd45\" (UniqueName: \"kubernetes.io/projected/fe21af78-21d5-440b-977a-1accce9c5ed3-kube-api-access-xdd45\") pod \"horizon-operator-controller-manager-68c6d99b8f-hgg7b\" (UID: \"fe21af78-21d5-440b-977a-1accce9c5ed3\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.150353 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qk7h\" (UniqueName: \"kubernetes.io/projected/5f6baea9-4909-4365-8091-d1d4acba26bd-kube-api-access-2qk7h\") pod \"designate-operator-controller-manager-78b4bc895b-7cqh2\" (UID: \"5f6baea9-4909-4365-8091-d1d4acba26bd\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.150414 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljp54\" (UniqueName: \"kubernetes.io/projected/1e0f0d87-e234-4800-847d-694de5f7dd68-kube-api-access-ljp54\") pod \"cinder-operator-controller-manager-859b6ccc6-l8jdl\" (UID: \"1e0f0d87-e234-4800-847d-694de5f7dd68\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl"
Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.150450 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkdnb\" (UniqueName: \"kubernetes.io/projected/eb7ca6ca-0075-46eb-9a5c-e445d06c3425-kube-api-access-dkdnb\") pod \"barbican-operator-controller-manager-7d9dfd778-krhmn\" (UID: \"eb7ca6ca-0075-46eb-9a5c-e445d06c3425\") "
pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.151273 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.151589 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9htv7" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.162089 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.163145 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.165618 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-tvlfv" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.173459 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.191078 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.192388 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.194114 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkdnb\" (UniqueName: \"kubernetes.io/projected/eb7ca6ca-0075-46eb-9a5c-e445d06c3425-kube-api-access-dkdnb\") pod \"barbican-operator-controller-manager-7d9dfd778-krhmn\" (UID: \"eb7ca6ca-0075-46eb-9a5c-e445d06c3425\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.203107 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-hsp6l" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.230429 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.238686 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-8htgg"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.243443 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.244553 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.247836 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.249353 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-lwh2q" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.251252 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cql5\" (UniqueName: \"kubernetes.io/projected/b52d37d6-575a-4fa3-96af-4d72413e41e3-kube-api-access-4cql5\") pod \"manila-operator-controller-manager-7c79b5df47-xhg2w\" (UID: \"b52d37d6-575a-4fa3-96af-4d72413e41e3\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.251322 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qk7h\" (UniqueName: \"kubernetes.io/projected/5f6baea9-4909-4365-8091-d1d4acba26bd-kube-api-access-2qk7h\") pod \"designate-operator-controller-manager-78b4bc895b-7cqh2\" (UID: \"5f6baea9-4909-4365-8091-d1d4acba26bd\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.251397 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljp54\" (UniqueName: \"kubernetes.io/projected/1e0f0d87-e234-4800-847d-694de5f7dd68-kube-api-access-ljp54\") pod \"cinder-operator-controller-manager-859b6ccc6-l8jdl\" (UID: \"1e0f0d87-e234-4800-847d-694de5f7dd68\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.251459 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtfg8\" (UniqueName: \"kubernetes.io/projected/219350c9-1342-44bb-82d0-6a121ebb354b-kube-api-access-qtfg8\") pod \"heat-operator-controller-manager-5f64f6f8bb-tqqnh\" (UID: \"219350c9-1342-44bb-82d0-6a121ebb354b\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.251485 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wx7t\" (UniqueName: \"kubernetes.io/projected/a1f52513-b5c6-45ac-9cf7-42e04ba8b114-kube-api-access-8wx7t\") pod \"glance-operator-controller-manager-77987cd8cd-kwnlh\" (UID: \"a1f52513-b5c6-45ac-9cf7-42e04ba8b114\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.251550 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdd45\" (UniqueName: \"kubernetes.io/projected/fe21af78-21d5-440b-977a-1accce9c5ed3-kube-api-access-xdd45\") pod \"horizon-operator-controller-manager-68c6d99b8f-hgg7b\" (UID: \"fe21af78-21d5-440b-977a-1accce9c5ed3\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.251602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt248\" (UniqueName: 
\"kubernetes.io/projected/3ba2c006-d33f-4179-8f24-73dcd6231085-kube-api-access-qt248\") pod \"keystone-operator-controller-manager-7765d96ddf-59jpj\" (UID: \"3ba2c006-d33f-4179-8f24-73dcd6231085\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.251631 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hblrz\" (UniqueName: \"kubernetes.io/projected/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-kube-api-access-hblrz\") pod \"infra-operator-controller-manager-57548d458d-8htgg\" (UID: \"95b0b5ac-33e1-430d-a501-f429b6ccb4fe\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.251662 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert\") pod \"infra-operator-controller-manager-57548d458d-8htgg\" (UID: \"95b0b5ac-33e1-430d-a501-f429b6ccb4fe\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.251709 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gmbr\" (UniqueName: \"kubernetes.io/projected/c0add30f-d439-45c8-93e7-793a49ef95dc-kube-api-access-9gmbr\") pod \"ironic-operator-controller-manager-6c548fd776-lqlll\" (UID: \"c0add30f-d439-45c8-93e7-793a49ef95dc\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.286467 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.287658 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.307836 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljp54\" (UniqueName: \"kubernetes.io/projected/1e0f0d87-e234-4800-847d-694de5f7dd68-kube-api-access-ljp54\") pod \"cinder-operator-controller-manager-859b6ccc6-l8jdl\" (UID: \"1e0f0d87-e234-4800-847d-694de5f7dd68\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.317924 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qk7h\" (UniqueName: \"kubernetes.io/projected/5f6baea9-4909-4365-8091-d1d4acba26bd-kube-api-access-2qk7h\") pod \"designate-operator-controller-manager-78b4bc895b-7cqh2\" (UID: \"5f6baea9-4909-4365-8091-d1d4acba26bd\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.318193 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtfg8\" (UniqueName: \"kubernetes.io/projected/219350c9-1342-44bb-82d0-6a121ebb354b-kube-api-access-qtfg8\") pod \"heat-operator-controller-manager-5f64f6f8bb-tqqnh\" (UID: \"219350c9-1342-44bb-82d0-6a121ebb354b\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.322017 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wx7t\" (UniqueName: \"kubernetes.io/projected/a1f52513-b5c6-45ac-9cf7-42e04ba8b114-kube-api-access-8wx7t\") pod \"glance-operator-controller-manager-77987cd8cd-kwnlh\" (UID: \"a1f52513-b5c6-45ac-9cf7-42e04ba8b114\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.324325 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdd45\" (UniqueName: \"kubernetes.io/projected/fe21af78-21d5-440b-977a-1accce9c5ed3-kube-api-access-xdd45\") pod \"horizon-operator-controller-manager-68c6d99b8f-hgg7b\" (UID: \"fe21af78-21d5-440b-977a-1accce9c5ed3\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.329831 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.373858 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.381489 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.382909 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt248\" (UniqueName: \"kubernetes.io/projected/3ba2c006-d33f-4179-8f24-73dcd6231085-kube-api-access-qt248\") pod \"keystone-operator-controller-manager-7765d96ddf-59jpj\" (UID: \"3ba2c006-d33f-4179-8f24-73dcd6231085\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.382954 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hblrz\" (UniqueName: \"kubernetes.io/projected/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-kube-api-access-hblrz\") pod \"infra-operator-controller-manager-57548d458d-8htgg\" (UID: \"95b0b5ac-33e1-430d-a501-f429b6ccb4fe\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.382993 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9jhk\" (UniqueName: \"kubernetes.io/projected/9333a12b-dae0-41c5-a41e-a42c94f5d668-kube-api-access-j9jhk\") pod \"mariadb-operator-controller-manager-56bbcc9d85-r6qhm\" (UID: \"9333a12b-dae0-41c5-a41e-a42c94f5d668\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.383048 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert\") pod \"infra-operator-controller-manager-57548d458d-8htgg\" (UID: \"95b0b5ac-33e1-430d-a501-f429b6ccb4fe\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.383091 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gmbr\" (UniqueName: \"kubernetes.io/projected/c0add30f-d439-45c8-93e7-793a49ef95dc-kube-api-access-9gmbr\") pod \"ironic-operator-controller-manager-6c548fd776-lqlll\" (UID: \"c0add30f-d439-45c8-93e7-793a49ef95dc\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.383114 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cql5\" (UniqueName: \"kubernetes.io/projected/b52d37d6-575a-4fa3-96af-4d72413e41e3-kube-api-access-4cql5\") pod \"manila-operator-controller-manager-7c79b5df47-xhg2w\" (UID: \"b52d37d6-575a-4fa3-96af-4d72413e41e3\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.383430 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n"] Dec 05 16:13:37 crc kubenswrapper[4778]: E1205 16:13:37.385053 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 16:13:37 crc kubenswrapper[4778]: E1205 16:13:37.385191 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert podName:95b0b5ac-33e1-430d-a501-f429b6ccb4fe nodeName:}" failed. 
No retries permitted until 2025-12-05 16:13:37.885159002 +0000 UTC m=+1104.988955382 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert") pod "infra-operator-controller-manager-57548d458d-8htgg" (UID: "95b0b5ac-33e1-430d-a501-f429b6ccb4fe") : secret "infra-operator-webhook-server-cert" not found Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.388158 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.398815 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.403966 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.405525 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-fdkzg" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.454339 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hblrz\" (UniqueName: \"kubernetes.io/projected/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-kube-api-access-hblrz\") pod \"infra-operator-controller-manager-57548d458d-8htgg\" (UID: \"95b0b5ac-33e1-430d-a501-f429b6ccb4fe\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.456816 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cql5\" (UniqueName: \"kubernetes.io/projected/b52d37d6-575a-4fa3-96af-4d72413e41e3-kube-api-access-4cql5\") pod \"manila-operator-controller-manager-7c79b5df47-xhg2w\" (UID: \"b52d37d6-575a-4fa3-96af-4d72413e41e3\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.470075 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt248\" (UniqueName: \"kubernetes.io/projected/3ba2c006-d33f-4179-8f24-73dcd6231085-kube-api-access-qt248\") pod \"keystone-operator-controller-manager-7765d96ddf-59jpj\" (UID: \"3ba2c006-d33f-4179-8f24-73dcd6231085\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.471164 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.473115 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gmbr\" (UniqueName: \"kubernetes.io/projected/c0add30f-d439-45c8-93e7-793a49ef95dc-kube-api-access-9gmbr\") pod \"ironic-operator-controller-manager-6c548fd776-lqlll\" (UID: \"c0add30f-d439-45c8-93e7-793a49ef95dc\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.473693 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.483293 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.507144 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9jhk\" (UniqueName: \"kubernetes.io/projected/9333a12b-dae0-41c5-a41e-a42c94f5d668-kube-api-access-j9jhk\") pod \"mariadb-operator-controller-manager-56bbcc9d85-r6qhm\" (UID: \"9333a12b-dae0-41c5-a41e-a42c94f5d668\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.507991 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5frq7\" (UniqueName: \"kubernetes.io/projected/aee0787e-b460-4811-aaf5-3ad30d1ca069-kube-api-access-5frq7\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-lvc5n\" (UID: \"aee0787e-b460-4811-aaf5-3ad30d1ca069\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.515585 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-h7bm4" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.552645 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.557287 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9jhk\" (UniqueName: \"kubernetes.io/projected/9333a12b-dae0-41c5-a41e-a42c94f5d668-kube-api-access-j9jhk\") pod \"mariadb-operator-controller-manager-56bbcc9d85-r6qhm\" (UID: \"9333a12b-dae0-41c5-a41e-a42c94f5d668\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.563013 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.595059 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.606014 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.611063 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z774k\" (UniqueName: \"kubernetes.io/projected/749eef58-2e11-4af9-80d5-b4ab23f257cc-kube-api-access-z774k\") pod \"nova-operator-controller-manager-697bc559fc-z2c7d\" (UID: \"749eef58-2e11-4af9-80d5-b4ab23f257cc\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.611127 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5frq7\" (UniqueName: \"kubernetes.io/projected/aee0787e-b460-4811-aaf5-3ad30d1ca069-kube-api-access-5frq7\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-lvc5n\" (UID: \"aee0787e-b460-4811-aaf5-3ad30d1ca069\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.638660 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.639809 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.643732 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-85b9b" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.697040 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5frq7\" (UniqueName: \"kubernetes.io/projected/aee0787e-b460-4811-aaf5-3ad30d1ca069-kube-api-access-5frq7\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-lvc5n\" (UID: \"aee0787e-b460-4811-aaf5-3ad30d1ca069\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.712238 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z774k\" (UniqueName: \"kubernetes.io/projected/749eef58-2e11-4af9-80d5-b4ab23f257cc-kube-api-access-z774k\") pod \"nova-operator-controller-manager-697bc559fc-z2c7d\" (UID: \"749eef58-2e11-4af9-80d5-b4ab23f257cc\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.729504 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.739330 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.788429 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.789609 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.796747 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.798337 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.802129 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.802819 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-njd74" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.803236 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-t6dwc" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.808998 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z774k\" (UniqueName: \"kubernetes.io/projected/749eef58-2e11-4af9-80d5-b4ab23f257cc-kube-api-access-z774k\") pod \"nova-operator-controller-manager-697bc559fc-z2c7d\" (UID: \"749eef58-2e11-4af9-80d5-b4ab23f257cc\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.816954 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqs89\" (UniqueName: \"kubernetes.io/projected/4b11c75e-cbea-4850-b090-a231f3908b53-kube-api-access-kqs89\") pod \"octavia-operator-controller-manager-998648c74-jx9x4\" (UID: \"4b11c75e-cbea-4850-b090-a231f3908b53\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.817332 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.822443 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.835776 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.837147 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.841666 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4828x"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.841690 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bbl87" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.843232 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.848120 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.853757 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-jfs85" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.861217 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.893349 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.894523 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.895054 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.897874 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4dpbt" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.905702 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4828x"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.915432 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.918833 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rktz\" (UniqueName: \"kubernetes.io/projected/662da656-0ba6-4ff7-85bd-6739ad5c5100-kube-api-access-7rktz\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw\" (UID: \"662da656-0ba6-4ff7-85bd-6739ad5c5100\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.918896 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqs89\" (UniqueName: \"kubernetes.io/projected/4b11c75e-cbea-4850-b090-a231f3908b53-kube-api-access-kqs89\") pod \"octavia-operator-controller-manager-998648c74-jx9x4\" (UID: \"4b11c75e-cbea-4850-b090-a231f3908b53\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.918948 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert\") pod \"infra-operator-controller-manager-57548d458d-8htgg\" (UID: \"95b0b5ac-33e1-430d-a501-f429b6ccb4fe\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.918973 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw\" (UID: \"662da656-0ba6-4ff7-85bd-6739ad5c5100\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.919017 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcpgh\" (UniqueName: \"kubernetes.io/projected/a2f059f8-ee96-4e29-a00f-bee69430c802-kube-api-access-qcpgh\") pod \"ovn-operator-controller-manager-b6456fdb6-jxckc\" (UID: \"a2f059f8-ee96-4e29-a00f-bee69430c802\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" Dec 05 16:13:37 crc kubenswrapper[4778]: E1205 16:13:37.919415 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 16:13:37 crc kubenswrapper[4778]: E1205 16:13:37.919462 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert podName:95b0b5ac-33e1-430d-a501-f429b6ccb4fe nodeName:}" failed. No retries permitted until 2025-12-05 16:13:38.919445344 +0000 UTC m=+1106.023241724 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert") pod "infra-operator-controller-manager-57548d458d-8htgg" (UID: "95b0b5ac-33e1-430d-a501-f429b6ccb4fe") : secret "infra-operator-webhook-server-cert" not found Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.938079 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-mfkct"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.939509 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-mfkct" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.946633 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqs89\" (UniqueName: \"kubernetes.io/projected/4b11c75e-cbea-4850-b090-a231f3908b53-kube-api-access-kqs89\") pod \"octavia-operator-controller-manager-998648c74-jx9x4\" (UID: \"4b11c75e-cbea-4850-b090-a231f3908b53\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.951622 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-6mnzq" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.960467 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-mfkct"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.966991 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp"] Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.968940 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.973060 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-ch9c7" Dec 05 16:13:37 crc kubenswrapper[4778]: I1205 16:13:37.975088 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp"] Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.020337 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hmmd\" (UniqueName: \"kubernetes.io/projected/996d89ac-bb27-41d3-9ea8-171d71c585e2-kube-api-access-5hmmd\") pod \"swift-operator-controller-manager-5f8c65bbfc-rcr6s\" (UID: \"996d89ac-bb27-41d3-9ea8-171d71c585e2\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.026401 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rktz\" (UniqueName: \"kubernetes.io/projected/662da656-0ba6-4ff7-85bd-6739ad5c5100-kube-api-access-7rktz\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw\" (UID: \"662da656-0ba6-4ff7-85bd-6739ad5c5100\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.026518 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c8pz\" (UniqueName: \"kubernetes.io/projected/0b521841-47d8-461f-a765-c9b7974bb4b7-kube-api-access-9c8pz\") pod \"placement-operator-controller-manager-78f8948974-4828x\" (UID: \"0b521841-47d8-461f-a765-c9b7974bb4b7\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.022466 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.026812 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw\" (UID: \"662da656-0ba6-4ff7-85bd-6739ad5c5100\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.026910 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.026968 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert podName:662da656-0ba6-4ff7-85bd-6739ad5c5100 nodeName:}" failed. No retries permitted until 2025-12-05 16:13:38.526953881 +0000 UTC m=+1105.630750261 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" (UID: "662da656-0ba6-4ff7-85bd-6739ad5c5100") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.027087 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s254b\" (UniqueName: \"kubernetes.io/projected/862bfade-07c6-405c-bbca-e96341188a5c-kube-api-access-s254b\") pod \"telemetry-operator-controller-manager-76cc84c6bb-hc7pn\" (UID: \"862bfade-07c6-405c-bbca-e96341188a5c\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.029584 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcpgh\" (UniqueName: \"kubernetes.io/projected/a2f059f8-ee96-4e29-a00f-bee69430c802-kube-api-access-qcpgh\") pod \"ovn-operator-controller-manager-b6456fdb6-jxckc\" (UID: \"a2f059f8-ee96-4e29-a00f-bee69430c802\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.062205 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcpgh\" (UniqueName: \"kubernetes.io/projected/a2f059f8-ee96-4e29-a00f-bee69430c802-kube-api-access-qcpgh\") pod \"ovn-operator-controller-manager-b6456fdb6-jxckc\" (UID: \"a2f059f8-ee96-4e29-a00f-bee69430c802\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.062727 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rktz\" (UniqueName: \"kubernetes.io/projected/662da656-0ba6-4ff7-85bd-6739ad5c5100-kube-api-access-7rktz\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw\" (UID: \"662da656-0ba6-4ff7-85bd-6739ad5c5100\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.081354 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw"] Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.113543 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.121297 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.121695 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.122432 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-p2j6f" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.135626 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s254b\" (UniqueName: \"kubernetes.io/projected/862bfade-07c6-405c-bbca-e96341188a5c-kube-api-access-s254b\") pod \"telemetry-operator-controller-manager-76cc84c6bb-hc7pn\" (UID: \"862bfade-07c6-405c-bbca-e96341188a5c\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.136435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkg5m\" (UniqueName: \"kubernetes.io/projected/65f4316e-afb7-4a97-b58e-89653f55ff4a-kube-api-access-fkg5m\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.136651 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.136982 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hmmd\" (UniqueName: \"kubernetes.io/projected/996d89ac-bb27-41d3-9ea8-171d71c585e2-kube-api-access-5hmmd\") pod \"swift-operator-controller-manager-5f8c65bbfc-rcr6s\" (UID: \"996d89ac-bb27-41d3-9ea8-171d71c585e2\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.137262 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c8pz\" (UniqueName: \"kubernetes.io/projected/0b521841-47d8-461f-a765-c9b7974bb4b7-kube-api-access-9c8pz\") pod \"placement-operator-controller-manager-78f8948974-4828x\" (UID: \"0b521841-47d8-461f-a765-c9b7974bb4b7\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.138251 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.138471 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7w42\" (UniqueName: \"kubernetes.io/projected/cd874cca-18d0-4bcc-a436-00d6e9bceb9e-kube-api-access-b7w42\") pod \"watcher-operator-controller-manager-6447f74d5-prnmp\" (UID: \"cd874cca-18d0-4bcc-a436-00d6e9bceb9e\") " pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.138607 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95k24\" (UniqueName: \"kubernetes.io/projected/33e87b02-f57e-47bb-934b-159e59f2d7f5-kube-api-access-95k24\") pod \"test-operator-controller-manager-5854674fcc-mfkct\" (UID: \"33e87b02-f57e-47bb-934b-159e59f2d7f5\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-mfkct" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.140170 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw"] Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.167343 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s254b\" (UniqueName: \"kubernetes.io/projected/862bfade-07c6-405c-bbca-e96341188a5c-kube-api-access-s254b\") pod \"telemetry-operator-controller-manager-76cc84c6bb-hc7pn\" (UID: \"862bfade-07c6-405c-bbca-e96341188a5c\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.169247 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hmmd\" (UniqueName: \"kubernetes.io/projected/996d89ac-bb27-41d3-9ea8-171d71c585e2-kube-api-access-5hmmd\") pod \"swift-operator-controller-manager-5f8c65bbfc-rcr6s\" (UID: \"996d89ac-bb27-41d3-9ea8-171d71c585e2\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.170913 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c8pz\" (UniqueName: \"kubernetes.io/projected/0b521841-47d8-461f-a765-c9b7974bb4b7-kube-api-access-9c8pz\") pod \"placement-operator-controller-manager-78f8948974-4828x\" (UID: \"0b521841-47d8-461f-a765-c9b7974bb4b7\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.189475 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727"] Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.193953 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.203897 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727"] Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.211053 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-gxw2g" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.239848 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95k24\" (UniqueName: \"kubernetes.io/projected/33e87b02-f57e-47bb-934b-159e59f2d7f5-kube-api-access-95k24\") pod \"test-operator-controller-manager-5854674fcc-mfkct\" (UID: \"33e87b02-f57e-47bb-934b-159e59f2d7f5\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-mfkct" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.239953 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkg5m\" (UniqueName: \"kubernetes.io/projected/65f4316e-afb7-4a97-b58e-89653f55ff4a-kube-api-access-fkg5m\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.239977 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.244707 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.244777 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7w42\" (UniqueName: \"kubernetes.io/projected/cd874cca-18d0-4bcc-a436-00d6e9bceb9e-kube-api-access-b7w42\") pod \"watcher-operator-controller-manager-6447f74d5-prnmp\" (UID: \"cd874cca-18d0-4bcc-a436-00d6e9bceb9e\") " pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.244719 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.245038 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs podName:65f4316e-afb7-4a97-b58e-89653f55ff4a nodeName:}" failed. No retries permitted until 2025-12-05 16:13:38.745017028 +0000 UTC m=+1105.848813408 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs") pod "openstack-operator-controller-manager-57f879d6c4-clslw" (UID: "65f4316e-afb7-4a97-b58e-89653f55ff4a") : secret "webhook-server-cert" not found Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.245394 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.245511 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs podName:65f4316e-afb7-4a97-b58e-89653f55ff4a nodeName:}" failed. No retries permitted until 2025-12-05 16:13:38.74549722 +0000 UTC m=+1105.849293600 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs") pod "openstack-operator-controller-manager-57f879d6c4-clslw" (UID: "65f4316e-afb7-4a97-b58e-89653f55ff4a") : secret "metrics-server-cert" not found Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.281512 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkg5m\" (UniqueName: \"kubernetes.io/projected/65f4316e-afb7-4a97-b58e-89653f55ff4a-kube-api-access-fkg5m\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.281914 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95k24\" (UniqueName: \"kubernetes.io/projected/33e87b02-f57e-47bb-934b-159e59f2d7f5-kube-api-access-95k24\") pod \"test-operator-controller-manager-5854674fcc-mfkct\" (UID: \"33e87b02-f57e-47bb-934b-159e59f2d7f5\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-mfkct" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.321629 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7w42\" (UniqueName: \"kubernetes.io/projected/cd874cca-18d0-4bcc-a436-00d6e9bceb9e-kube-api-access-b7w42\") pod \"watcher-operator-controller-manager-6447f74d5-prnmp\" (UID: \"cd874cca-18d0-4bcc-a436-00d6e9bceb9e\") " pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.337803 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.345965 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvnzq\" (UniqueName: \"kubernetes.io/projected/48f6bc28-3426-42fb-9498-1280593297ea-kube-api-access-tvnzq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b6727\" (UID: \"48f6bc28-3426-42fb-9498-1280593297ea\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.353788 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.420634 4778 util.go:30] "No sandbox for pod can be found. 
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.281512 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkg5m\" (UniqueName: \"kubernetes.io/projected/65f4316e-afb7-4a97-b58e-89653f55ff4a-kube-api-access-fkg5m\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.281914 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95k24\" (UniqueName: \"kubernetes.io/projected/33e87b02-f57e-47bb-934b-159e59f2d7f5-kube-api-access-95k24\") pod \"test-operator-controller-manager-5854674fcc-mfkct\" (UID: \"33e87b02-f57e-47bb-934b-159e59f2d7f5\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-mfkct"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.321629 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7w42\" (UniqueName: \"kubernetes.io/projected/cd874cca-18d0-4bcc-a436-00d6e9bceb9e-kube-api-access-b7w42\") pod \"watcher-operator-controller-manager-6447f74d5-prnmp\" (UID: \"cd874cca-18d0-4bcc-a436-00d6e9bceb9e\") " pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.337803 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.345965 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvnzq\" (UniqueName: \"kubernetes.io/projected/48f6bc28-3426-42fb-9498-1280593297ea-kube-api-access-tvnzq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b6727\" (UID: \"48f6bc28-3426-42fb-9498-1280593297ea\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.353788 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.420634 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.449259 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvnzq\" (UniqueName: \"kubernetes.io/projected/48f6bc28-3426-42fb-9498-1280593297ea-kube-api-access-tvnzq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b6727\" (UID: \"48f6bc28-3426-42fb-9498-1280593297ea\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.468546 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.476784 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvnzq\" (UniqueName: \"kubernetes.io/projected/48f6bc28-3426-42fb-9498-1280593297ea-kube-api-access-tvnzq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b6727\" (UID: \"48f6bc28-3426-42fb-9498-1280593297ea\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.490537 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-mfkct"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.531146 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.551066 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw\" (UID: \"662da656-0ba6-4ff7-85bd-6739ad5c5100\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw"
Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.551260 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.551348 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert podName:662da656-0ba6-4ff7-85bd-6739ad5c5100 nodeName:}" failed. No retries permitted until 2025-12-05 16:13:39.551319409 +0000 UTC m=+1106.655115789 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" (UID: "662da656-0ba6-4ff7-85bd-6739ad5c5100") : secret "openstack-baremetal-operator-webhook-server-cert" not found
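The root cause of these mount failures is not the kubelet itself: the pods reference TLS Secrets (webhook-server-cert, metrics-server-cert, openstack-baremetal-operator-webhook-server-cert) that do not exist in the openstack-operators namespace yet, which is common while the operator bundle's certificates are still being issued (typically by cert-manager or OLM shortly after the Deployments appear). The mounts keep retrying and recover on their own once the Secrets exist. A sketch of checking for them with client-go, assuming a kubeconfig at the conventional location; the namespace and Secret names are taken from the log:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes $HOME/.kube/config points at the cluster in question.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Secret names copied from the failed MountVolume entries above.
	for _, name := range []string{
		"webhook-server-cert",
		"metrics-server-cert",
		"openstack-baremetal-operator-webhook-server-cert",
	} {
		_, err := cs.CoreV1().Secrets("openstack-operators").Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			fmt.Printf("%s: %v\n", name, err)
		} else {
			fmt.Printf("%s: present\n", name)
		}
	}
}
```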
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.672930 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727"
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.756224 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw"
Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.756404 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.757341 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs podName:65f4316e-afb7-4a97-b58e-89653f55ff4a nodeName:}" failed. No retries permitted until 2025-12-05 16:13:39.757321495 +0000 UTC m=+1106.861117875 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs") pod "openstack-operator-controller-manager-57f879d6c4-clslw" (UID: "65f4316e-afb7-4a97-b58e-89653f55ff4a") : secret "metrics-server-cert" not found
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.757931 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw"
Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.758064 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.758096 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs podName:65f4316e-afb7-4a97-b58e-89653f55ff4a nodeName:}" failed. No retries permitted until 2025-12-05 16:13:39.758085275 +0000 UTC m=+1106.861881655 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs") pod "openstack-operator-controller-manager-57f879d6c4-clslw" (UID: "65f4316e-afb7-4a97-b58e-89653f55ff4a") : secret "webhook-server-cert" not found
Dec 05 16:13:38 crc kubenswrapper[4778]: I1205 16:13:38.960707 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert\") pod \"infra-operator-controller-manager-57548d458d-8htgg\" (UID: \"95b0b5ac-33e1-430d-a501-f429b6ccb4fe\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg"
Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.961565 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 05 16:13:38 crc kubenswrapper[4778]: E1205 16:13:38.961631 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert podName:95b0b5ac-33e1-430d-a501-f429b6ccb4fe nodeName:}" failed. No retries permitted until 2025-12-05 16:13:40.961613366 +0000 UTC m=+1108.065409746 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert") pod "infra-operator-controller-manager-57548d458d-8htgg" (UID: "95b0b5ac-33e1-430d-a501-f429b6ccb4fe") : secret "infra-operator-webhook-server-cert" not found
Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.086466 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn"]
Dec 05 16:13:39 crc kubenswrapper[4778]: W1205 16:13:39.108495 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e0f0d87_e234_4800_847d_694de5f7dd68.slice/crio-4056d34888ba4e1b3ea42718c76642aca22bd6ca75ef65f28ca43925d543dd2b WatchSource:0}: Error finding container 4056d34888ba4e1b3ea42718c76642aca22bd6ca75ef65f28ca43925d543dd2b: Status 404 returned error can't find the container with id 4056d34888ba4e1b3ea42718c76642aca22bd6ca75ef65f28ca43925d543dd2b
Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.120106 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl"]
Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.428328 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll"]
Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.447075 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4"]
Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.469707 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm"]
Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.482183 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh"]
Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.496424 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj"]
Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.529176 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh"]
Dec 05 16:13:39 crc kubenswrapper[4778]: W1205 16:13:39.541928 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod219350c9_1342_44bb_82d0_6a121ebb354b.slice/crio-b904db64d441a9229a8d467e0efd6acdc21982858da9b70d80e9485a4eac21a6 WatchSource:0}: Error finding container b904db64d441a9229a8d467e0efd6acdc21982858da9b70d80e9485a4eac21a6: Status 404 returned error can't find the container with id b904db64d441a9229a8d467e0efd6acdc21982858da9b70d80e9485a4eac21a6
Dec 05 16:13:39 crc kubenswrapper[4778]: W1205 16:13:39.543925 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee0787e_b460_4811_aaf5_3ad30d1ca069.slice/crio-c6bea6181d23a2cab5c775f9c83cc0fbda496aab81d8a01f27711fa8f9b4637b WatchSource:0}: Error finding container c6bea6181d23a2cab5c775f9c83cc0fbda496aab81d8a01f27711fa8f9b4637b: Status 404 returned error can't find the container with id c6bea6181d23a2cab5c775f9c83cc0fbda496aab81d8a01f27711fa8f9b4637b
Dec 05 16:13:39 crc kubenswrapper[4778]: W1205 16:13:39.545227 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod749eef58_2e11_4af9_80d5_b4ab23f257cc.slice/crio-5d75d37016b04770f8b9f7314214afb4f41562b137b7670074d6bc1c49553e33 WatchSource:0}: Error finding container 5d75d37016b04770f8b9f7314214afb4f41562b137b7670074d6bc1c49553e33: Status 404 returned error can't find the container with id 5d75d37016b04770f8b9f7314214afb4f41562b137b7670074d6bc1c49553e33
Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.553962 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b"]
Dec 05 16:13:39 crc kubenswrapper[4778]: W1205 16:13:39.555707 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f6baea9_4909_4365_8091_d1d4acba26bd.slice/crio-3db59a25bdc09479e162ab11832a447808cd53f743ad28071d404883841e782d WatchSource:0}: Error finding container 3db59a25bdc09479e162ab11832a447808cd53f743ad28071d404883841e782d: Status 404 returned error can't find the container with id 3db59a25bdc09479e162ab11832a447808cd53f743ad28071d404883841e782d
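The "Failed to process watch event ... can't find the container with id ..." warnings above are a different, usually harmless pattern: the cgroup for a just-created container appears (and triggers a watch event in cAdvisor's manager) before the runtime has finished registering the container, so the first lookup misses with a 404 and succeeds on a later pass. A toy Go simulation of that ordering race, not cAdvisor's actual code; the container ID is shortened from the log:

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// registry stands in for the runtime's view of known containers.
type registry struct {
	mu         sync.Mutex
	containers map[string]bool
}

func (r *registry) register(id string) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.containers[id] = true
}

func (r *registry) lookup(id string) bool {
	r.mu.Lock()
	defer r.mu.Unlock()
	return r.containers[id]
}

func main() {
	r := &registry{containers: map[string]bool{}}
	id := "4056d34888ba4e1b" // shortened from the log, illustrative only

	// The runtime finishes registering the container slightly later.
	go func() {
		time.Sleep(50 * time.Millisecond)
		r.register(id)
	}()

	// The cgroup watch event arrives first, so the lookup misses.
	if !r.lookup(id) {
		fmt.Printf("can't find the container with id %s (benign; retried later)\n", id)
	}
	time.Sleep(100 * time.Millisecond)
	if r.lookup(id) {
		fmt.Printf("container %s visible on retry\n", id)
	}
}
```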
Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.569172 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w"]
Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.574999 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw\" (UID: \"662da656-0ba6-4ff7-85bd-6739ad5c5100\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw"
Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.575198 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.575250 4778 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert podName:662da656-0ba6-4ff7-85bd-6739ad5c5100 nodeName:}" failed. No retries permitted until 2025-12-05 16:13:41.575232616 +0000 UTC m=+1108.679028996 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" (UID: "662da656-0ba6-4ff7-85bd-6739ad5c5100") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.581564 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2qk7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-7cqh2_openstack-operators(5f6baea9-4909-4365-8091-d1d4acba26bd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.583336 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2qk7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-7cqh2_openstack-operators(5f6baea9-4909-4365-8091-d1d4acba26bd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.584587 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2" podUID="5f6baea9-4909-4365-8091-d1d4acba26bd" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.590652 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s254b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-hc7pn_openstack-operators(862bfade-07c6-405c-bbca-e96341188a5c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.591177 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n"] Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.592771 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s254b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-hc7pn_openstack-operators(862bfade-07c6-405c-bbca-e96341188a5c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.592912 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9c8pz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-4828x_openstack-operators(0b521841-47d8-461f-a765-c9b7974bb4b7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.594135 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" podUID="862bfade-07c6-405c-bbca-e96341188a5c" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.594580 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9c8pz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-4828x_openstack-operators(0b521841-47d8-461f-a765-c9b7974bb4b7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.596426 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" podUID="0b521841-47d8-461f-a765-c9b7974bb4b7" Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.602334 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d"] Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.611091 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2"] Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.615550 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn"] Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.622744 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4828x"] Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.778178 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.778287 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.778484 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.778555 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs podName:65f4316e-afb7-4a97-b58e-89653f55ff4a nodeName:}" failed. No retries permitted until 2025-12-05 16:13:41.7785231 +0000 UTC m=+1108.882319480 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs") pod "openstack-operator-controller-manager-57f879d6c4-clslw" (UID: "65f4316e-afb7-4a97-b58e-89653f55ff4a") : secret "metrics-server-cert" not found Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.778966 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.779017 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs podName:65f4316e-afb7-4a97-b58e-89653f55ff4a nodeName:}" failed. No retries permitted until 2025-12-05 16:13:41.778990042 +0000 UTC m=+1108.882786422 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs") pod "openstack-operator-controller-manager-57f879d6c4-clslw" (UID: "65f4316e-afb7-4a97-b58e-89653f55ff4a") : secret "webhook-server-cert" not found Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.809434 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-mfkct"] Dec 05 16:13:39 crc kubenswrapper[4778]: W1205 16:13:39.814554 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e87b02_f57e_47bb_934b_159e59f2d7f5.slice/crio-979d55189b5e02b3e030f24c3a43ab47f50e5aad97626989554bd41524e24c73 WatchSource:0}: Error finding container 979d55189b5e02b3e030f24c3a43ab47f50e5aad97626989554bd41524e24c73: Status 404 returned error can't find the container with id 979d55189b5e02b3e030f24c3a43ab47f50e5aad97626989554bd41524e24c73 Dec 05 16:13:39 crc kubenswrapper[4778]: W1205 16:13:39.847577 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod996d89ac_bb27_41d3_9ea8_171d71c585e2.slice/crio-c6a365ea7e835a6c3a62374f970ab7065460c0493b613a3e0c57108e5431b7b6 WatchSource:0}: Error finding container c6a365ea7e835a6c3a62374f970ab7065460c0493b613a3e0c57108e5431b7b6: Status 404 returned error can't find the container with id c6a365ea7e835a6c3a62374f970ab7065460c0493b613a3e0c57108e5431b7b6 Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.849084 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s"] Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.849880 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5hmmd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-rcr6s_openstack-operators(996d89ac-bb27-41d3-9ea8-171d71c585e2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.851165 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tvnzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-b6727_openstack-operators(48f6bc28-3426-42fb-9498-1280593297ea): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.851866 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5hmmd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-rcr6s_openstack-operators(996d89ac-bb27-41d3-9ea8-171d71c585e2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.852426 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727" podUID="48f6bc28-3426-42fb-9498-1280593297ea" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.852935 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" podUID="996d89ac-bb27-41d3-9ea8-171d71c585e2" Dec 
05 16:13:39 crc kubenswrapper[4778]: W1205 16:13:39.856494 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2f059f8_ee96_4e29_a00f_bee69430c802.slice/crio-eaa793ef057205a0aef1e0f88f2b119881a34f26cb6dcb88865dbeeaa92cf450 WatchSource:0}: Error finding container eaa793ef057205a0aef1e0f88f2b119881a34f26cb6dcb88865dbeeaa92cf450: Status 404 returned error can't find the container with id eaa793ef057205a0aef1e0f88f2b119881a34f26cb6dcb88865dbeeaa92cf450 Dec 05 16:13:39 crc kubenswrapper[4778]: W1205 16:13:39.857720 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd874cca_18d0_4bcc_a436_00d6e9bceb9e.slice/crio-f0dc0d48cf20e7782cc5c0c336ed21f2a3c1b59306fba53e75b81b4e19c69f5c WatchSource:0}: Error finding container f0dc0d48cf20e7782cc5c0c336ed21f2a3c1b59306fba53e75b81b4e19c69f5c: Status 404 returned error can't find the container with id f0dc0d48cf20e7782cc5c0c336ed21f2a3c1b59306fba53e75b81b4e19c69f5c Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.865173 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qcpgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovn-operator-controller-manager-b6456fdb6-jxckc_openstack-operators(a2f059f8-ee96-4e29-a00f-bee69430c802): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.865636 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.36:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b7w42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6447f74d5-prnmp_openstack-operators(cd874cca-18d0-4bcc-a436-00d6e9bceb9e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.868492 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727"] Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.869870 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b7w42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6447f74d5-prnmp_openstack-operators(cd874cca-18d0-4bcc-a436-00d6e9bceb9e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.869876 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qcpgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-jxckc_openstack-operators(a2f059f8-ee96-4e29-a00f-bee69430c802): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.871334 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" podUID="a2f059f8-ee96-4e29-a00f-bee69430c802" Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.871581 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.883719 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" event={"ID":"219350c9-1342-44bb-82d0-6a121ebb354b","Type":"ContainerStarted","Data":"b904db64d441a9229a8d467e0efd6acdc21982858da9b70d80e9485a4eac21a6"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.886033 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc"] Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.889938 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" event={"ID":"4b11c75e-cbea-4850-b090-a231f3908b53","Type":"ContainerStarted","Data":"62d6b52eb66bcea4147995a253bc504b3e660f14a097ad665ea7d7e618139805"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.894238 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp"] Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.895596 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n" event={"ID":"aee0787e-b460-4811-aaf5-3ad30d1ca069","Type":"ContainerStarted","Data":"c6bea6181d23a2cab5c775f9c83cc0fbda496aab81d8a01f27711fa8f9b4637b"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.900564 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" event={"ID":"a2f059f8-ee96-4e29-a00f-bee69430c802","Type":"ContainerStarted","Data":"eaa793ef057205a0aef1e0f88f2b119881a34f26cb6dcb88865dbeeaa92cf450"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.903756 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" event={"ID":"9333a12b-dae0-41c5-a41e-a42c94f5d668","Type":"ContainerStarted","Data":"9f241774e05e843b5f353fdf56e74f6712a9036eee76f1e699cff9e72465d4fa"} Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.903827 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" podUID="a2f059f8-ee96-4e29-a00f-bee69430c802" Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.905039 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727" event={"ID":"48f6bc28-3426-42fb-9498-1280593297ea","Type":"ContainerStarted","Data":"7d6ce5c5c8815f73b85d6c96ac9f7c94d8a31d3cfa0b5607c9609fc10609b5fc"} Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.906683 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727" podUID="48f6bc28-3426-42fb-9498-1280593297ea" Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.908453 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b" event={"ID":"fe21af78-21d5-440b-977a-1accce9c5ed3","Type":"ContainerStarted","Data":"a509e0c33f1edfd7ef724bacde7ad0e5673fa206558c1df33e443108a94a442a"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.909814 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" event={"ID":"a1f52513-b5c6-45ac-9cf7-42e04ba8b114","Type":"ContainerStarted","Data":"df11b75b6ca6dc9f86f4e409001c0376f0da926eced3db7396be2dfa96a7c196"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.910621 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll" event={"ID":"c0add30f-d439-45c8-93e7-793a49ef95dc","Type":"ContainerStarted","Data":"0eedd2068215cd873f4f3f4128dcc0a9cb8b3002be012b5d13db703e1a86a9b9"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.912067 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" event={"ID":"862bfade-07c6-405c-bbca-e96341188a5c","Type":"ContainerStarted","Data":"dfca08c9a6fa2dc8400c6ebd394efda625dcdd928cd44851e5ee1c5059d4934b"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.914496 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn" event={"ID":"eb7ca6ca-0075-46eb-9a5c-e445d06c3425","Type":"ContainerStarted","Data":"725e32a06c757dae93b15365e84b813ca9d21d727d55fe7c1f7dd983c0929b3a"} Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.915258 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" podUID="862bfade-07c6-405c-bbca-e96341188a5c" Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.918610 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-mfkct" event={"ID":"33e87b02-f57e-47bb-934b-159e59f2d7f5","Type":"ContainerStarted","Data":"979d55189b5e02b3e030f24c3a43ab47f50e5aad97626989554bd41524e24c73"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.924287 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" event={"ID":"0b521841-47d8-461f-a765-c9b7974bb4b7","Type":"ContainerStarted","Data":"a9230d13b525108a0d25d439161dafac0c836a28595ed429bd5140fdb567d638"} Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.931340 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" podUID="0b521841-47d8-461f-a765-c9b7974bb4b7" Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.931736 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" event={"ID":"749eef58-2e11-4af9-80d5-b4ab23f257cc","Type":"ContainerStarted","Data":"5d75d37016b04770f8b9f7314214afb4f41562b137b7670074d6bc1c49553e33"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.932825 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" event={"ID":"3ba2c006-d33f-4179-8f24-73dcd6231085","Type":"ContainerStarted","Data":"5aaabd1fa00d3a19bcaf0a40b2906bdf1742fcccee3a12bf7af1a725ef2e8e2f"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.933597 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" event={"ID":"cd874cca-18d0-4bcc-a436-00d6e9bceb9e","Type":"ContainerStarted","Data":"f0dc0d48cf20e7782cc5c0c336ed21f2a3c1b59306fba53e75b81b4e19c69f5c"} Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.935216 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.935687 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2" event={"ID":"5f6baea9-4909-4365-8091-d1d4acba26bd","Type":"ContainerStarted","Data":"3db59a25bdc09479e162ab11832a447808cd53f743ad28071d404883841e782d"} Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.938203 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2" podUID="5f6baea9-4909-4365-8091-d1d4acba26bd" Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.938414 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl" event={"ID":"1e0f0d87-e234-4800-847d-694de5f7dd68","Type":"ContainerStarted","Data":"4056d34888ba4e1b3ea42718c76642aca22bd6ca75ef65f28ca43925d543dd2b"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.941617 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w" event={"ID":"b52d37d6-575a-4fa3-96af-4d72413e41e3","Type":"ContainerStarted","Data":"737e25fb8fb408a7456a6478d2bd03aec0bc992e3536430ff9e3344ebf37088d"} Dec 05 16:13:39 crc kubenswrapper[4778]: I1205 16:13:39.946837 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" event={"ID":"996d89ac-bb27-41d3-9ea8-171d71c585e2","Type":"ContainerStarted","Data":"c6a365ea7e835a6c3a62374f970ab7065460c0493b613a3e0c57108e5431b7b6"} Dec 05 16:13:39 crc kubenswrapper[4778]: E1205 16:13:39.950460 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" podUID="996d89ac-bb27-41d3-9ea8-171d71c585e2" Dec 05 16:13:40 crc kubenswrapper[4778]: E1205 16:13:40.955025 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" podUID="996d89ac-bb27-41d3-9ea8-171d71c585e2" Dec 05 16:13:40 crc kubenswrapper[4778]: E1205 16:13:40.955115 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" podUID="0b521841-47d8-461f-a765-c9b7974bb4b7" Dec 05 16:13:40 crc kubenswrapper[4778]: E1205 16:13:40.955166 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" podUID="a2f059f8-ee96-4e29-a00f-bee69430c802" Dec 05 16:13:40 crc kubenswrapper[4778]: E1205 16:13:40.955188 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727" podUID="48f6bc28-3426-42fb-9498-1280593297ea" Dec 05 16:13:40 crc kubenswrapper[4778]: E1205 16:13:40.955514 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2" podUID="5f6baea9-4909-4365-8091-d1d4acba26bd" Dec 05 16:13:40 crc kubenswrapper[4778]: E1205 16:13:40.956458 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" podUID="862bfade-07c6-405c-bbca-e96341188a5c" Dec 05 16:13:40 crc kubenswrapper[4778]: E1205 16:13:40.959458 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" Dec 05 16:13:41 crc kubenswrapper[4778]: I1205 16:13:41.004383 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert\") pod \"infra-operator-controller-manager-57548d458d-8htgg\" (UID: \"95b0b5ac-33e1-430d-a501-f429b6ccb4fe\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:13:41 crc kubenswrapper[4778]: E1205 16:13:41.004542 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 16:13:41 crc kubenswrapper[4778]: E1205 16:13:41.004599 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert podName:95b0b5ac-33e1-430d-a501-f429b6ccb4fe nodeName:}" failed. No retries permitted until 2025-12-05 16:13:45.00458397 +0000 UTC m=+1112.108380350 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert") pod "infra-operator-controller-manager-57548d458d-8htgg" (UID: "95b0b5ac-33e1-430d-a501-f429b6ccb4fe") : secret "infra-operator-webhook-server-cert" not found Dec 05 16:13:41 crc kubenswrapper[4778]: I1205 16:13:41.611336 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw\" (UID: \"662da656-0ba6-4ff7-85bd-6739ad5c5100\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:13:41 crc kubenswrapper[4778]: E1205 16:13:41.611493 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:13:41 crc kubenswrapper[4778]: E1205 16:13:41.611568 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert podName:662da656-0ba6-4ff7-85bd-6739ad5c5100 nodeName:}" failed. No retries permitted until 2025-12-05 16:13:45.611549634 +0000 UTC m=+1112.715346014 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" (UID: "662da656-0ba6-4ff7-85bd-6739ad5c5100") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:13:41 crc kubenswrapper[4778]: I1205 16:13:41.813690 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:41 crc kubenswrapper[4778]: I1205 16:13:41.813813 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:41 crc kubenswrapper[4778]: E1205 16:13:41.813886 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 16:13:41 crc kubenswrapper[4778]: E1205 16:13:41.813920 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 16:13:41 crc kubenswrapper[4778]: E1205 16:13:41.813959 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs podName:65f4316e-afb7-4a97-b58e-89653f55ff4a nodeName:}" failed. No retries permitted until 2025-12-05 16:13:45.813937713 +0000 UTC m=+1112.917734093 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs") pod "openstack-operator-controller-manager-57f879d6c4-clslw" (UID: "65f4316e-afb7-4a97-b58e-89653f55ff4a") : secret "metrics-server-cert" not found Dec 05 16:13:41 crc kubenswrapper[4778]: E1205 16:13:41.813976 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs podName:65f4316e-afb7-4a97-b58e-89653f55ff4a nodeName:}" failed. No retries permitted until 2025-12-05 16:13:45.813969784 +0000 UTC m=+1112.917766164 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs") pod "openstack-operator-controller-manager-57f879d6c4-clslw" (UID: "65f4316e-afb7-4a97-b58e-89653f55ff4a") : secret "webhook-server-cert" not found Dec 05 16:13:45 crc kubenswrapper[4778]: I1205 16:13:45.063754 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert\") pod \"infra-operator-controller-manager-57548d458d-8htgg\" (UID: \"95b0b5ac-33e1-430d-a501-f429b6ccb4fe\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:13:45 crc kubenswrapper[4778]: E1205 16:13:45.063949 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 16:13:45 crc kubenswrapper[4778]: E1205 16:13:45.064247 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert podName:95b0b5ac-33e1-430d-a501-f429b6ccb4fe nodeName:}" failed. No retries permitted until 2025-12-05 16:13:53.064226169 +0000 UTC m=+1120.168022549 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert") pod "infra-operator-controller-manager-57548d458d-8htgg" (UID: "95b0b5ac-33e1-430d-a501-f429b6ccb4fe") : secret "infra-operator-webhook-server-cert" not found Dec 05 16:13:45 crc kubenswrapper[4778]: I1205 16:13:45.687410 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw\" (UID: \"662da656-0ba6-4ff7-85bd-6739ad5c5100\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:13:45 crc kubenswrapper[4778]: E1205 16:13:45.687598 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:13:45 crc kubenswrapper[4778]: E1205 16:13:45.687689 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert podName:662da656-0ba6-4ff7-85bd-6739ad5c5100 nodeName:}" failed. No retries permitted until 2025-12-05 16:13:53.687663701 +0000 UTC m=+1120.791460091 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" (UID: "662da656-0ba6-4ff7-85bd-6739ad5c5100") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:13:45 crc kubenswrapper[4778]: I1205 16:13:45.889918 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:45 crc kubenswrapper[4778]: I1205 16:13:45.890045 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:45 crc kubenswrapper[4778]: E1205 16:13:45.890081 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 16:13:45 crc kubenswrapper[4778]: E1205 16:13:45.890171 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs podName:65f4316e-afb7-4a97-b58e-89653f55ff4a nodeName:}" failed. No retries permitted until 2025-12-05 16:13:53.890150103 +0000 UTC m=+1120.993946493 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs") pod "openstack-operator-controller-manager-57f879d6c4-clslw" (UID: "65f4316e-afb7-4a97-b58e-89653f55ff4a") : secret "metrics-server-cert" not found Dec 05 16:13:45 crc kubenswrapper[4778]: E1205 16:13:45.890190 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 16:13:45 crc kubenswrapper[4778]: E1205 16:13:45.890240 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs podName:65f4316e-afb7-4a97-b58e-89653f55ff4a nodeName:}" failed. No retries permitted until 2025-12-05 16:13:53.890228235 +0000 UTC m=+1120.994024615 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs") pod "openstack-operator-controller-manager-57f879d6c4-clslw" (UID: "65f4316e-afb7-4a97-b58e-89653f55ff4a") : secret "webhook-server-cert" not found Dec 05 16:13:53 crc kubenswrapper[4778]: E1205 16:13:53.047222 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 05 16:13:53 crc kubenswrapper[4778]: E1205 16:13:53.047888 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kqs89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-jx9x4_openstack-operators(4b11c75e-cbea-4850-b090-a231f3908b53): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:13:53 crc kubenswrapper[4778]: I1205 16:13:53.103229 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert\") pod \"infra-operator-controller-manager-57548d458d-8htgg\" (UID: \"95b0b5ac-33e1-430d-a501-f429b6ccb4fe\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:13:53 crc kubenswrapper[4778]: I1205 16:13:53.113472 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95b0b5ac-33e1-430d-a501-f429b6ccb4fe-cert\") pod \"infra-operator-controller-manager-57548d458d-8htgg\" (UID: \"95b0b5ac-33e1-430d-a501-f429b6ccb4fe\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:13:53 crc kubenswrapper[4778]: I1205 16:13:53.394132 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:13:53 crc kubenswrapper[4778]: I1205 16:13:53.710747 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw\" (UID: \"662da656-0ba6-4ff7-85bd-6739ad5c5100\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:13:53 crc kubenswrapper[4778]: I1205 16:13:53.716904 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/662da656-0ba6-4ff7-85bd-6739ad5c5100-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw\" (UID: \"662da656-0ba6-4ff7-85bd-6739ad5c5100\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:13:53 crc kubenswrapper[4778]: I1205 16:13:53.913724 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:53 crc kubenswrapper[4778]: I1205 16:13:53.913904 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:53 crc kubenswrapper[4778]: I1205 16:13:53.917022 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-metrics-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:53 crc kubenswrapper[4778]: I1205 16:13:53.918992 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/65f4316e-afb7-4a97-b58e-89653f55ff4a-webhook-certs\") pod \"openstack-operator-controller-manager-57f879d6c4-clslw\" (UID: \"65f4316e-afb7-4a97-b58e-89653f55ff4a\") " pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:53 crc kubenswrapper[4778]: I1205 16:13:53.940020 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:13:54 crc kubenswrapper[4778]: I1205 16:13:54.200907 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:13:55 crc kubenswrapper[4778]: E1205 16:13:55.670981 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809" Dec 05 16:13:55 crc kubenswrapper[4778]: E1205 16:13:55.671590 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8wx7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-kwnlh_openstack-operators(a1f52513-b5c6-45ac-9cf7-42e04ba8b114): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:13:57 crc kubenswrapper[4778]: E1205 16:13:57.113819 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 05 16:13:57 crc kubenswrapper[4778]: E1205 16:13:57.114609 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j9jhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-r6qhm_openstack-operators(9333a12b-dae0-41c5-a41e-a42c94f5d668): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:13:57 crc kubenswrapper[4778]: E1205 16:13:57.485334 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 05 16:13:57 crc kubenswrapper[4778]: E1205 16:13:57.485785 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qtfg8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-tqqnh_openstack-operators(219350c9-1342-44bb-82d0-6a121ebb354b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:13:59 crc kubenswrapper[4778]: E1205 16:13:59.074073 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 05 16:13:59 crc kubenswrapper[4778]: E1205 16:13:59.074249 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z774k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-z2c7d_openstack-operators(749eef58-2e11-4af9-80d5-b4ab23f257cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:13:59 crc kubenswrapper[4778]: E1205 16:13:59.603117 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 05 16:13:59 crc kubenswrapper[4778]: E1205 16:13:59.603617 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qt248,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-59jpj_openstack-operators(3ba2c006-d33f-4179-8f24-73dcd6231085): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:14:03 crc kubenswrapper[4778]: I1205 16:14:03.415222 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:14:03 crc kubenswrapper[4778]: I1205 16:14:03.415552 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:14:03 crc kubenswrapper[4778]: I1205 16:14:03.415607 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 16:14:03 crc kubenswrapper[4778]: I1205 16:14:03.416075 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aea0312e36d87c23ce634b679d6ae2137df783585ff65eb7e4e65c9564abd0b6"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:14:03 crc kubenswrapper[4778]: I1205 16:14:03.416138 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://aea0312e36d87c23ce634b679d6ae2137df783585ff65eb7e4e65c9564abd0b6" gracePeriod=600 Dec 05 16:14:04 crc kubenswrapper[4778]: I1205 16:14:04.137782 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="aea0312e36d87c23ce634b679d6ae2137df783585ff65eb7e4e65c9564abd0b6" exitCode=0 Dec 05 16:14:04 crc kubenswrapper[4778]: I1205 16:14:04.137826 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"aea0312e36d87c23ce634b679d6ae2137df783585ff65eb7e4e65c9564abd0b6"} Dec 05 16:14:04 crc kubenswrapper[4778]: I1205 16:14:04.137856 4778 scope.go:117] "RemoveContainer" containerID="d05f43ec797c17341e7f030a46399a4fd0a9ce3922c28bd4bd201675fc830e2a" Dec 05 16:14:27 crc kubenswrapper[4778]: E1205 16:14:27.508342 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385" Dec 05 16:14:27 crc kubenswrapper[4778]: E1205 16:14:27.509226 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s254b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-hc7pn_openstack-operators(862bfade-07c6-405c-bbca-e96341188a5c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:14:28 crc kubenswrapper[4778]: E1205 16:14:28.724141 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 05 16:14:28 crc kubenswrapper[4778]: E1205 16:14:28.725237 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tvnzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-b6727_openstack-operators(48f6bc28-3426-42fb-9498-1280593297ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:14:28 crc kubenswrapper[4778]: E1205 16:14:28.726999 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727" podUID="48f6bc28-3426-42fb-9498-1280593297ea" Dec 05 16:14:28 crc kubenswrapper[4778]: E1205 16:14:28.806332 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 16:14:28 crc kubenswrapper[4778]: E1205 16:14:28.806486 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8wx7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-kwnlh_openstack-operators(a1f52513-b5c6-45ac-9cf7-42e04ba8b114): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 05 16:14:28 crc kubenswrapper[4778]: E1205 16:14:28.807637 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" podUID="a1f52513-b5c6-45ac-9cf7-42e04ba8b114" Dec 05 16:14:28 crc kubenswrapper[4778]: E1205 16:14:28.882410 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346" Dec 05 16:14:28 crc kubenswrapper[4778]: E1205 16:14:28.882489 4778 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346" Dec 05 16:14:28 crc kubenswrapper[4778]: E1205 16:14:28.882707 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.36:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b7w42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6447f74d5-prnmp_openstack-operators(cd874cca-18d0-4bcc-a436-00d6e9bceb9e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:14:28 crc kubenswrapper[4778]: E1205 16:14:28.887523 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 16:14:28 crc kubenswrapper[4778]: E1205 16:14:28.887689 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qt248,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
keystone-operator-controller-manager-7765d96ddf-59jpj_openstack-operators(3ba2c006-d33f-4179-8f24-73dcd6231085): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 05 16:14:28 crc kubenswrapper[4778]: E1205 16:14:28.888793 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" podUID="3ba2c006-d33f-4179-8f24-73dcd6231085" Dec 05 16:14:29 crc kubenswrapper[4778]: E1205 16:14:29.048358 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 16:14:29 crc kubenswrapper[4778]: E1205 16:14:29.048766 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z774k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-z2c7d_openstack-operators(749eef58-2e11-4af9-80d5-b4ab23f257cc): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 05 16:14:29 crc kubenswrapper[4778]: E1205 16:14:29.050558 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" podUID="749eef58-2e11-4af9-80d5-b4ab23f257cc" Dec 05 16:14:29 crc kubenswrapper[4778]: I1205 16:14:29.343785 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-8htgg"] Dec 05 16:14:29 crc kubenswrapper[4778]: I1205 16:14:29.354483 
4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw"] Dec 05 16:14:29 crc kubenswrapper[4778]: I1205 16:14:29.355881 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl" event={"ID":"1e0f0d87-e234-4800-847d-694de5f7dd68","Type":"ContainerStarted","Data":"a7c51bc6ca9a49ed5e697a40aa553021acfdd209a1d7a5df20ee68a66ebd8901"} Dec 05 16:14:29 crc kubenswrapper[4778]: I1205 16:14:29.359531 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-mfkct" event={"ID":"33e87b02-f57e-47bb-934b-159e59f2d7f5","Type":"ContainerStarted","Data":"0257a9704cd2c2f227abe89068315025703ff59335d862fc55e339bf41249792"} Dec 05 16:14:29 crc kubenswrapper[4778]: I1205 16:14:29.364400 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn" event={"ID":"eb7ca6ca-0075-46eb-9a5c-e445d06c3425","Type":"ContainerStarted","Data":"e72b9869c8726a1a1933547377fcbb0d582a2e7df97c7fc0e480de43e6c8b62a"} Dec 05 16:14:29 crc kubenswrapper[4778]: I1205 16:14:29.372227 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw"] Dec 05 16:14:29 crc kubenswrapper[4778]: E1205 16:14:29.778452 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 16:14:29 crc kubenswrapper[4778]: E1205 16:14:29.778739 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qtfg8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-tqqnh_openstack-operators(219350c9-1342-44bb-82d0-6a121ebb354b): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 05 16:14:29 crc kubenswrapper[4778]: E1205 16:14:29.786519 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" podUID="219350c9-1342-44bb-82d0-6a121ebb354b" Dec 05 16:14:29 crc kubenswrapper[4778]: W1205 16:14:29.800550 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod662da656_0ba6_4ff7_85bd_6739ad5c5100.slice/crio-2e74d5ee58d3788695645f5369f5c5b476e3657f62f4d134e66971b20c067ca7 WatchSource:0}: Error finding container 2e74d5ee58d3788695645f5369f5c5b476e3657f62f4d134e66971b20c067ca7: Status 404 returned error can't find the container with id 2e74d5ee58d3788695645f5369f5c5b476e3657f62f4d134e66971b20c067ca7 Dec 05 16:14:30 crc kubenswrapper[4778]: E1205 16:14:30.362941 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 16:14:30 crc kubenswrapper[4778]: E1205 16:14:30.363427 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j9jhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-r6qhm_openstack-operators(9333a12b-dae0-41c5-a41e-a42c94f5d668): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 05 16:14:30 crc kubenswrapper[4778]: E1205 16:14:30.365244 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" podUID="9333a12b-dae0-41c5-a41e-a42c94f5d668" Dec 05 16:14:30 crc kubenswrapper[4778]: I1205 16:14:30.372291 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" event={"ID":"65f4316e-afb7-4a97-b58e-89653f55ff4a","Type":"ContainerStarted","Data":"2f3c5acfe2cfea8083a230b4976afbd0b996e2348e2025e13db1cabe98c50f8f"} Dec 05 16:14:30 crc kubenswrapper[4778]: I1205 16:14:30.376860 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" event={"ID":"662da656-0ba6-4ff7-85bd-6739ad5c5100","Type":"ContainerStarted","Data":"2e74d5ee58d3788695645f5369f5c5b476e3657f62f4d134e66971b20c067ca7"} Dec 05 16:14:30 crc kubenswrapper[4778]: I1205 16:14:30.378102 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b" event={"ID":"fe21af78-21d5-440b-977a-1accce9c5ed3","Type":"ContainerStarted","Data":"2d0308c61c5eb1a2e6957d1fb87cbf06868040b7f6e3485a72c27dc76dca0013"} Dec 05 16:14:30 crc kubenswrapper[4778]: I1205 16:14:30.379322 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" event={"ID":"996d89ac-bb27-41d3-9ea8-171d71c585e2","Type":"ContainerStarted","Data":"f7c9dfea36d784737627d904159dcbc953656175196a939259143f91de876e2b"} Dec 05 16:14:30 crc kubenswrapper[4778]: I1205 16:14:30.380208 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" event={"ID":"95b0b5ac-33e1-430d-a501-f429b6ccb4fe","Type":"ContainerStarted","Data":"aa5e86d38f0578ebefc8af6f8d027ee644ce9669b785849dade6a8399da8d8f7"} Dec 05 16:14:30 crc kubenswrapper[4778]: I1205 16:14:30.381941 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"9838a46c7fca5484e5528acba6a6dc7600262ec3d0517e19089e823847361767"} Dec 05 16:14:30 crc kubenswrapper[4778]: I1205 16:14:30.383900 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n" event={"ID":"aee0787e-b460-4811-aaf5-3ad30d1ca069","Type":"ContainerStarted","Data":"4f97d474a79dd91bcc33467e1a216cc7189fa8f1c398f6008abe1da65f4f46f2"} Dec 05 16:14:30 crc kubenswrapper[4778]: I1205 16:14:30.385420 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll" event={"ID":"c0add30f-d439-45c8-93e7-793a49ef95dc","Type":"ContainerStarted","Data":"d6938088d93d509f3631b76a35cf7b3f3d387aeee9c1d824304d461e79f03d63"} Dec 05 16:14:31 crc kubenswrapper[4778]: E1205 16:14:31.371378 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 16:14:31 crc kubenswrapper[4778]: E1205 16:14:31.372171 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kqs89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-jx9x4_openstack-operators(4b11c75e-cbea-4850-b090-a231f3908b53): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 05 16:14:31 crc kubenswrapper[4778]: E1205 16:14:31.373465 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" podUID="4b11c75e-cbea-4850-b090-a231f3908b53" Dec 05 16:14:31 crc kubenswrapper[4778]: I1205 16:14:31.407172 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2" event={"ID":"5f6baea9-4909-4365-8091-d1d4acba26bd","Type":"ContainerStarted","Data":"21a18088ed3191f0c6afd6b71e57a6534e37293e8f868b11d35fe7d6b815c93d"} Dec 05 16:14:31 crc kubenswrapper[4778]: I1205 16:14:31.411100 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w" event={"ID":"b52d37d6-575a-4fa3-96af-4d72413e41e3","Type":"ContainerStarted","Data":"d545733c828e0e96f8f513fd048a1f0476bde144f76059ecbb956a47158fda31"} Dec 05 16:14:31 crc kubenswrapper[4778]: I1205 16:14:31.417438 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" event={"ID":"a2f059f8-ee96-4e29-a00f-bee69430c802","Type":"ContainerStarted","Data":"eb33de64fb6950629d5fa8c84b92d9cb3c00c12123d11290ec5313cbd5ecfc25"} Dec 05 16:14:31 crc kubenswrapper[4778]: I1205 16:14:31.420008 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" event={"ID":"0b521841-47d8-461f-a765-c9b7974bb4b7","Type":"ContainerStarted","Data":"a72bd4134a8a91e505165d0ea00421fd7d361da71d6750ecf35726d9596579c2"} Dec 05 16:14:31 crc kubenswrapper[4778]: E1205 16:14:31.932284 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" podUID="a1f52513-b5c6-45ac-9cf7-42e04ba8b114" Dec 05 16:14:31 crc 
kubenswrapper[4778]: E1205 16:14:31.941128 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" podUID="749eef58-2e11-4af9-80d5-b4ab23f257cc" Dec 05 16:14:32 crc kubenswrapper[4778]: E1205 16:14:32.016645 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" podUID="3ba2c006-d33f-4179-8f24-73dcd6231085" Dec 05 16:14:32 crc kubenswrapper[4778]: E1205 16:14:32.017999 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" podUID="219350c9-1342-44bb-82d0-6a121ebb354b" Dec 05 16:14:32 crc kubenswrapper[4778]: I1205 16:14:32.436075 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" event={"ID":"4b11c75e-cbea-4850-b090-a231f3908b53","Type":"ContainerStarted","Data":"c7d6fa9e09d81eb683e1143f0030cc7a4ce6c91fd9e75adc7148544b62ef9843"} Dec 05 16:14:32 crc kubenswrapper[4778]: I1205 16:14:32.438150 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" event={"ID":"65f4316e-afb7-4a97-b58e-89653f55ff4a","Type":"ContainerStarted","Data":"5cf915586fdbb930d6cb412af78c5844f4b27d3cb62472006e54127b1e9c8daa"} Dec 05 16:14:32 crc kubenswrapper[4778]: I1205 16:14:32.440086 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:14:32 crc kubenswrapper[4778]: I1205 16:14:32.442461 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" event={"ID":"749eef58-2e11-4af9-80d5-b4ab23f257cc","Type":"ContainerStarted","Data":"346efb1dde9a1547a67ccd3cb5c6222e36e75ac261f32c24b3c6467ee72e3a50"} Dec 05 16:14:32 crc kubenswrapper[4778]: I1205 16:14:32.447804 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" Dec 05 16:14:32 crc kubenswrapper[4778]: I1205 16:14:32.468828 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" event={"ID":"3ba2c006-d33f-4179-8f24-73dcd6231085","Type":"ContainerStarted","Data":"3deecf999cd1c52059e480abad12d31bb8502ec930ab20c5d16b7f716dbb83db"} Dec 05 16:14:32 crc kubenswrapper[4778]: I1205 16:14:32.469566 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" Dec 05 16:14:32 crc kubenswrapper[4778]: I1205 16:14:32.473832 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" podStartSLOduration=55.473811382 podStartE2EDuration="55.473811382s" 
podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:14:32.469902998 +0000 UTC m=+1159.573699378" watchObservedRunningTime="2025-12-05 16:14:32.473811382 +0000 UTC m=+1159.577607762" Dec 05 16:14:32 crc kubenswrapper[4778]: I1205 16:14:32.478675 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" event={"ID":"a1f52513-b5c6-45ac-9cf7-42e04ba8b114","Type":"ContainerStarted","Data":"10bdf127470e312ede6cfaaeb65a084533ec1ed28049fd426f33bc1b8f6b7855"} Dec 05 16:14:32 crc kubenswrapper[4778]: I1205 16:14:32.479706 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" Dec 05 16:14:32 crc kubenswrapper[4778]: I1205 16:14:32.485781 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" event={"ID":"219350c9-1342-44bb-82d0-6a121ebb354b","Type":"ContainerStarted","Data":"d5e99425c25c1a867e980392221315c8cdb0541333223d9117ea07be2e4ceda0"} Dec 05 16:14:32 crc kubenswrapper[4778]: I1205 16:14:32.486621 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" Dec 05 16:14:32 crc kubenswrapper[4778]: E1205 16:14:32.733942 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" podUID="749eef58-2e11-4af9-80d5-b4ab23f257cc" Dec 05 16:14:32 crc kubenswrapper[4778]: E1205 16:14:32.739411 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" podUID="219350c9-1342-44bb-82d0-6a121ebb354b" Dec 05 16:14:32 crc kubenswrapper[4778]: E1205 16:14:32.740216 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" podUID="3ba2c006-d33f-4179-8f24-73dcd6231085" Dec 05 16:14:32 crc kubenswrapper[4778]: E1205 16:14:32.740229 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" podUID="a1f52513-b5c6-45ac-9cf7-42e04ba8b114" Dec 05 16:14:32 crc kubenswrapper[4778]: E1205 16:14:32.740112 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" podUID="4b11c75e-cbea-4850-b090-a231f3908b53" Dec 05 16:14:33 crc kubenswrapper[4778]: 
E1205 16:14:33.009030 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" podUID="9333a12b-dae0-41c5-a41e-a42c94f5d668" Dec 05 16:14:33 crc kubenswrapper[4778]: I1205 16:14:33.523203 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" event={"ID":"9333a12b-dae0-41c5-a41e-a42c94f5d668","Type":"ContainerStarted","Data":"cddac74a295c20f19feaab8546174bab9525f38bbf23bcc4cf915ff34f7ae98e"} Dec 05 16:14:33 crc kubenswrapper[4778]: E1205 16:14:33.526074 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" podUID="4b11c75e-cbea-4850-b090-a231f3908b53" Dec 05 16:14:33 crc kubenswrapper[4778]: E1205 16:14:33.526716 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" podUID="3ba2c006-d33f-4179-8f24-73dcd6231085" Dec 05 16:14:33 crc kubenswrapper[4778]: E1205 16:14:33.526877 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" podUID="a1f52513-b5c6-45ac-9cf7-42e04ba8b114" Dec 05 16:14:33 crc kubenswrapper[4778]: E1205 16:14:33.527585 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" podUID="9333a12b-dae0-41c5-a41e-a42c94f5d668" Dec 05 16:14:33 crc kubenswrapper[4778]: E1205 16:14:33.527697 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" podUID="749eef58-2e11-4af9-80d5-b4ab23f257cc" Dec 05 16:14:33 crc kubenswrapper[4778]: E1205 16:14:33.527755 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" podUID="219350c9-1342-44bb-82d0-6a121ebb354b" Dec 05 16:14:36 crc kubenswrapper[4778]: E1205 16:14:36.284069 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" Dec 05 16:14:36 crc kubenswrapper[4778]: I1205 16:14:36.548539 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll" event={"ID":"c0add30f-d439-45c8-93e7-793a49ef95dc","Type":"ContainerStarted","Data":"22e9623d946cc9e9a8765e792a91220400caa972a22f98bd9b60fd20a64bcb94"} Dec 05 16:14:36 crc kubenswrapper[4778]: I1205 16:14:36.550182 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll" Dec 05 16:14:36 crc kubenswrapper[4778]: I1205 16:14:36.551380 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" event={"ID":"cd874cca-18d0-4bcc-a436-00d6e9bceb9e","Type":"ContainerStarted","Data":"f0e9843bf43526b483227b11cbcdb0ba4770aafa05c05dacd7d244a2123073f9"} Dec 05 16:14:36 crc kubenswrapper[4778]: E1205 16:14:36.552558 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" Dec 05 16:14:36 crc kubenswrapper[4778]: I1205 16:14:36.553131 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll" Dec 05 16:14:36 crc kubenswrapper[4778]: I1205 16:14:36.553440 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" event={"ID":"95b0b5ac-33e1-430d-a501-f429b6ccb4fe","Type":"ContainerStarted","Data":"7ee7bf57b98dd38f9362affc28760828b3c5824a9f216b18fc58776c5f219843"} Dec 05 16:14:36 crc kubenswrapper[4778]: I1205 16:14:36.555540 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n" event={"ID":"aee0787e-b460-4811-aaf5-3ad30d1ca069","Type":"ContainerStarted","Data":"7a86947ff4bd6dd17f64f3909dce4933344ae87ea8e8293ab2ca83be5c232a11"} Dec 05 16:14:36 crc kubenswrapper[4778]: I1205 16:14:36.555715 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n" Dec 05 16:14:36 crc kubenswrapper[4778]: I1205 16:14:36.557184 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" event={"ID":"a2f059f8-ee96-4e29-a00f-bee69430c802","Type":"ContainerStarted","Data":"26aedc8bc776a41640cc0d995845c842a88cfea81a752d328d89332bc4685e42"} Dec 05 16:14:36 crc kubenswrapper[4778]: I1205 16:14:36.557532 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n" Dec 05 16:14:36 crc kubenswrapper[4778]: I1205 16:14:36.574481 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lqlll" podStartSLOduration=3.171884135 podStartE2EDuration="59.574458901s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.438977674 
+0000 UTC m=+1106.542774044" lastFinishedPulling="2025-12-05 16:14:35.8415524 +0000 UTC m=+1162.945348810" observedRunningTime="2025-12-05 16:14:36.567558728 +0000 UTC m=+1163.671355108" watchObservedRunningTime="2025-12-05 16:14:36.574458901 +0000 UTC m=+1163.678255301" Dec 05 16:14:36 crc kubenswrapper[4778]: I1205 16:14:36.653401 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-lvc5n" podStartSLOduration=3.43462412 podStartE2EDuration="59.653379749s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.552477192 +0000 UTC m=+1106.656273572" lastFinishedPulling="2025-12-05 16:14:35.771232821 +0000 UTC m=+1162.875029201" observedRunningTime="2025-12-05 16:14:36.648950531 +0000 UTC m=+1163.752746921" watchObservedRunningTime="2025-12-05 16:14:36.653379749 +0000 UTC m=+1163.757176129" Dec 05 16:14:36 crc kubenswrapper[4778]: E1205 16:14:36.720814 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" podUID="862bfade-07c6-405c-bbca-e96341188a5c" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.378007 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.408033 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.556002 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.569153 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2" event={"ID":"5f6baea9-4909-4365-8091-d1d4acba26bd","Type":"ContainerStarted","Data":"a24259b8969b39b4b53f1cf44d037accb463bb49a995f9b08ea46bd2b024d941"} Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.569384 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.570990 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" event={"ID":"996d89ac-bb27-41d3-9ea8-171d71c585e2","Type":"ContainerStarted","Data":"6241e4719686254b7372ac911a60c9bec180392435ff361aeca3af65609da4e9"} Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.571274 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.571327 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.575577 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" Dec 05 16:14:37 crc 
kubenswrapper[4778]: I1205 16:14:37.586956 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" event={"ID":"95b0b5ac-33e1-430d-a501-f429b6ccb4fe","Type":"ContainerStarted","Data":"4350c7a70e4dc96775a131decd11bac5a3a0ad0f3ba3a86e677906ba56449110"} Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.587426 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.610074 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-mfkct" event={"ID":"33e87b02-f57e-47bb-934b-159e59f2d7f5","Type":"ContainerStarted","Data":"2591c648efa4dfdd5536b63e8dac8f8133b4471959e3da197e3f01efbc6f1a1c"} Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.610247 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-mfkct" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.613042 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-mfkct" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.626317 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rcr6s" podStartSLOduration=4.635677496 podStartE2EDuration="1m0.62629884s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.849730253 +0000 UTC m=+1106.953526643" lastFinishedPulling="2025-12-05 16:14:35.840351607 +0000 UTC m=+1162.944147987" observedRunningTime="2025-12-05 16:14:37.623419353 +0000 UTC m=+1164.727215733" watchObservedRunningTime="2025-12-05 16:14:37.62629884 +0000 UTC m=+1164.730095220" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.652779 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn" event={"ID":"eb7ca6ca-0075-46eb-9a5c-e445d06c3425","Type":"ContainerStarted","Data":"1a84ff77d761fc9994834beae883fc14b079c780320c9270db8c9f29037883ed"} Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.653969 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.665473 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" event={"ID":"0b521841-47d8-461f-a765-c9b7974bb4b7","Type":"ContainerStarted","Data":"97726710c1903f633b9a9d1b588668678881ea9e0da60bdc7800327a06d6cf64"} Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.666995 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.671207 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.686627 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn" Dec 05 16:14:37 crc 
kubenswrapper[4778]: I1205 16:14:37.691616 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7cqh2" podStartSLOduration=5.439379686 podStartE2EDuration="1m1.691598485s" podCreationTimestamp="2025-12-05 16:13:36 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.58140059 +0000 UTC m=+1106.685196960" lastFinishedPulling="2025-12-05 16:14:35.833619379 +0000 UTC m=+1162.937415759" observedRunningTime="2025-12-05 16:14:37.686630024 +0000 UTC m=+1164.790426404" watchObservedRunningTime="2025-12-05 16:14:37.691598485 +0000 UTC m=+1164.795394865" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.718560 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" event={"ID":"662da656-0ba6-4ff7-85bd-6739ad5c5100","Type":"ContainerStarted","Data":"feeb7ae8cdbe0c990557c6724d3e5a4719cd7749456d38cc72f7113e97a9cbb2"} Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.718842 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" event={"ID":"662da656-0ba6-4ff7-85bd-6739ad5c5100","Type":"ContainerStarted","Data":"52b208ab623da0e6a2edc42995eaa5e1bc1081b1285fe86ac386c530d1f756b6"} Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.719774 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.737982 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" podStartSLOduration=54.726664301 podStartE2EDuration="1m0.737965908s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:14:29.820839413 +0000 UTC m=+1156.924635803" lastFinishedPulling="2025-12-05 16:14:35.83214103 +0000 UTC m=+1162.935937410" observedRunningTime="2025-12-05 16:14:37.737840455 +0000 UTC m=+1164.841636845" watchObservedRunningTime="2025-12-05 16:14:37.737965908 +0000 UTC m=+1164.841762288" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.740408 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.740490 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl" event={"ID":"1e0f0d87-e234-4800-847d-694de5f7dd68","Type":"ContainerStarted","Data":"6158fc08adadbcb723a0fc2de2858ef2a3680e373fe54fe138791736a39363c7"} Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.740901 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.743021 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w" event={"ID":"b52d37d6-575a-4fa3-96af-4d72413e41e3","Type":"ContainerStarted","Data":"707638fcd41067a65e0549773ef16e98152de5161839fa81e078a1431c4887ee"} Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.750789 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.751163 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.754539 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b" event={"ID":"fe21af78-21d5-440b-977a-1accce9c5ed3","Type":"ContainerStarted","Data":"731ec72b61b134cf5c7f49ae8a1fd440a20eb6e08003348d94c92ae9b4ff2a24"} Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.766665 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.780189 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.781277 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" event={"ID":"862bfade-07c6-405c-bbca-e96341188a5c","Type":"ContainerStarted","Data":"724525e80ca4d313ca50e98aa6931304884b842ad128429ca8ec32964f22be27"} Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.782528 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" Dec 05 16:14:37 crc kubenswrapper[4778]: E1205 16:14:37.784572 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" podUID="862bfade-07c6-405c-bbca-e96341188a5c" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.789248 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.801532 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" podStartSLOduration=54.776750843 podStartE2EDuration="1m0.801515258s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:14:29.802485935 +0000 UTC m=+1156.906282315" lastFinishedPulling="2025-12-05 16:14:35.82725033 +0000 UTC m=+1162.931046730" observedRunningTime="2025-12-05 16:14:37.782824861 +0000 UTC m=+1164.886621261" watchObservedRunningTime="2025-12-05 16:14:37.801515258 +0000 UTC m=+1164.905311638" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.861461 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4828x" podStartSLOduration=4.616100104 podStartE2EDuration="1m0.86144344s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.592829884 +0000 UTC m=+1106.696626264" lastFinishedPulling="2025-12-05 16:14:35.83817322 +0000 UTC m=+1162.941969600" observedRunningTime="2025-12-05 16:14:37.830321213 +0000 UTC 
m=+1164.934117593" watchObservedRunningTime="2025-12-05 16:14:37.86144344 +0000 UTC m=+1164.965239820" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.863660 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-krhmn" podStartSLOduration=5.117938143 podStartE2EDuration="1m1.863651779s" podCreationTimestamp="2025-12-05 16:13:36 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.099499931 +0000 UTC m=+1106.203296311" lastFinishedPulling="2025-12-05 16:14:35.845213567 +0000 UTC m=+1162.949009947" observedRunningTime="2025-12-05 16:14:37.858663377 +0000 UTC m=+1164.962459767" watchObservedRunningTime="2025-12-05 16:14:37.863651779 +0000 UTC m=+1164.967448159" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.895880 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-mfkct" podStartSLOduration=4.859908435 podStartE2EDuration="1m0.895860455s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.815953185 +0000 UTC m=+1106.919749565" lastFinishedPulling="2025-12-05 16:14:35.851905195 +0000 UTC m=+1162.955701585" observedRunningTime="2025-12-05 16:14:37.890677597 +0000 UTC m=+1164.994473997" watchObservedRunningTime="2025-12-05 16:14:37.895860455 +0000 UTC m=+1164.999656835" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.904156 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.920645 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w" podStartSLOduration=4.655777038 podStartE2EDuration="1m0.920627243s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.56746375 +0000 UTC m=+1106.671260130" lastFinishedPulling="2025-12-05 16:14:35.832313945 +0000 UTC m=+1162.936110335" observedRunningTime="2025-12-05 16:14:37.914645745 +0000 UTC m=+1165.018442125" watchObservedRunningTime="2025-12-05 16:14:37.920627243 +0000 UTC m=+1165.024423623" Dec 05 16:14:37 crc kubenswrapper[4778]: I1205 16:14:37.990879 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-hgg7b" podStartSLOduration=4.649689697 podStartE2EDuration="1m0.99086103s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.498875106 +0000 UTC m=+1106.602671486" lastFinishedPulling="2025-12-05 16:14:35.840046439 +0000 UTC m=+1162.943842819" observedRunningTime="2025-12-05 16:14:37.960073392 +0000 UTC m=+1165.063869762" watchObservedRunningTime="2025-12-05 16:14:37.99086103 +0000 UTC m=+1165.094657410" Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.023577 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-l8jdl" podStartSLOduration=5.364354913 podStartE2EDuration="1m2.02355826s" podCreationTimestamp="2025-12-05 16:13:36 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.112079525 +0000 UTC m=+1106.215875905" lastFinishedPulling="2025-12-05 16:14:35.771282872 +0000 UTC m=+1162.875079252" observedRunningTime="2025-12-05 16:14:38.016085651 +0000 UTC m=+1165.119882031" 
watchObservedRunningTime="2025-12-05 16:14:38.02355826 +0000 UTC m=+1165.127354640" Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.024507 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.067884 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jxckc" podStartSLOduration=5.08199776 podStartE2EDuration="1m1.067867988s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.859484662 +0000 UTC m=+1106.963281042" lastFinishedPulling="2025-12-05 16:14:35.84535488 +0000 UTC m=+1162.949151270" observedRunningTime="2025-12-05 16:14:38.045413441 +0000 UTC m=+1165.149209821" watchObservedRunningTime="2025-12-05 16:14:38.067867988 +0000 UTC m=+1165.171664368" Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.067997 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.788796 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" event={"ID":"3ba2c006-d33f-4179-8f24-73dcd6231085","Type":"ContainerStarted","Data":"f4cf340cb92f04f4f423274e7114813fe56623a9bd843facdb77da2c903c9eee"} Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.790566 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" event={"ID":"9333a12b-dae0-41c5-a41e-a42c94f5d668","Type":"ContainerStarted","Data":"7e4a0d4f91a346a72dd94cc6201438a69a9868e5b9fbce213d64e706a0092052"} Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.792992 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" event={"ID":"a1f52513-b5c6-45ac-9cf7-42e04ba8b114","Type":"ContainerStarted","Data":"6cffdb858a839285fa08e3b2785372e3b77581c4109666f07807fe93644d45a0"} Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.795054 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" event={"ID":"219350c9-1342-44bb-82d0-6a121ebb354b","Type":"ContainerStarted","Data":"594e72560f2b835985cb12697206cf1a23f9f3ee05a207fd4aff54b4961f55c2"} Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.797243 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" event={"ID":"4b11c75e-cbea-4850-b090-a231f3908b53","Type":"ContainerStarted","Data":"620fda20ca80a67e6c2ca714927fd759ce543e5be5994e4381b609b2472cb869"} Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.801045 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" event={"ID":"749eef58-2e11-4af9-80d5-b4ab23f257cc","Type":"ContainerStarted","Data":"88f62400d6c3317d4e51a056b831d87c0e83fc880c75a2adbc19137eb6525b38"} Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.805511 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w" Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.809667 4778 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhg2w" Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.816074 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-59jpj" podStartSLOduration=9.969835813 podStartE2EDuration="1m1.816054565s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.516628609 +0000 UTC m=+1106.620424989" lastFinishedPulling="2025-12-05 16:14:31.362847361 +0000 UTC m=+1158.466643741" observedRunningTime="2025-12-05 16:14:38.813956069 +0000 UTC m=+1165.917752499" watchObservedRunningTime="2025-12-05 16:14:38.816054565 +0000 UTC m=+1165.919850955" Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.851185 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jx9x4" podStartSLOduration=9.24632153 podStartE2EDuration="1m1.851163288s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.487338679 +0000 UTC m=+1106.591135059" lastFinishedPulling="2025-12-05 16:14:32.092180437 +0000 UTC m=+1159.195976817" observedRunningTime="2025-12-05 16:14:38.833330204 +0000 UTC m=+1165.937126604" watchObservedRunningTime="2025-12-05 16:14:38.851163288 +0000 UTC m=+1165.954959668" Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.907033 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kwnlh" podStartSLOduration=11.044713752 podStartE2EDuration="1m2.907016463s" podCreationTimestamp="2025-12-05 16:13:36 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.502734609 +0000 UTC m=+1106.606530989" lastFinishedPulling="2025-12-05 16:14:31.36503732 +0000 UTC m=+1158.468833700" observedRunningTime="2025-12-05 16:14:38.903782487 +0000 UTC m=+1166.007578877" watchObservedRunningTime="2025-12-05 16:14:38.907016463 +0000 UTC m=+1166.010812843" Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.953835 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-r6qhm" podStartSLOduration=8.699838145 podStartE2EDuration="1m1.953820767s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.489102267 +0000 UTC m=+1106.592898647" lastFinishedPulling="2025-12-05 16:14:32.743084889 +0000 UTC m=+1159.846881269" observedRunningTime="2025-12-05 16:14:38.932948302 +0000 UTC m=+1166.036744682" watchObservedRunningTime="2025-12-05 16:14:38.953820767 +0000 UTC m=+1166.057617147" Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.957151 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-z2c7d" podStartSLOduration=9.975678576 podStartE2EDuration="1m1.957137825s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.547969011 +0000 UTC m=+1106.651765401" lastFinishedPulling="2025-12-05 16:14:31.52942827 +0000 UTC m=+1158.633224650" observedRunningTime="2025-12-05 16:14:38.951798273 +0000 UTC m=+1166.055594653" watchObservedRunningTime="2025-12-05 16:14:38.957137825 +0000 UTC m=+1166.060934205" Dec 05 16:14:38 crc kubenswrapper[4778]: I1205 16:14:38.969194 4778 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-tqqnh" podStartSLOduration=9.92496928 podStartE2EDuration="1m1.969173035s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.555271026 +0000 UTC m=+1106.659067406" lastFinishedPulling="2025-12-05 16:14:31.599474781 +0000 UTC m=+1158.703271161" observedRunningTime="2025-12-05 16:14:38.966468843 +0000 UTC m=+1166.070265223" watchObservedRunningTime="2025-12-05 16:14:38.969173035 +0000 UTC m=+1166.072969645" Dec 05 16:14:41 crc kubenswrapper[4778]: E1205 16:14:41.251587 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727" podUID="48f6bc28-3426-42fb-9498-1280593297ea" Dec 05 16:14:43 crc kubenswrapper[4778]: I1205 16:14:43.401727 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-8htgg" Dec 05 16:14:43 crc kubenswrapper[4778]: I1205 16:14:43.947978 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw" Dec 05 16:14:44 crc kubenswrapper[4778]: I1205 16:14:44.207214 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-57f879d6c4-clslw" Dec 05 16:14:50 crc kubenswrapper[4778]: I1205 16:14:50.917295 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" event={"ID":"862bfade-07c6-405c-bbca-e96341188a5c","Type":"ContainerStarted","Data":"e02c07c64c7f7c5163c988c4860740a46933f19374e8bb23a55637cc78f9763c"} Dec 05 16:14:50 crc kubenswrapper[4778]: I1205 16:14:50.918076 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" Dec 05 16:14:50 crc kubenswrapper[4778]: I1205 16:14:50.936265 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn" podStartSLOduration=2.8295119140000002 podStartE2EDuration="1m13.936247481s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.590523152 +0000 UTC m=+1106.694319532" lastFinishedPulling="2025-12-05 16:14:50.697258719 +0000 UTC m=+1177.801055099" observedRunningTime="2025-12-05 16:14:50.934056952 +0000 UTC m=+1178.037853342" watchObservedRunningTime="2025-12-05 16:14:50.936247481 +0000 UTC m=+1178.040043861" Dec 05 16:14:51 crc kubenswrapper[4778]: I1205 16:14:51.929229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" event={"ID":"cd874cca-18d0-4bcc-a436-00d6e9bceb9e","Type":"ContainerStarted","Data":"4dedacfd4f3f3e0fd3fb6f4b856e2aa470fbdd74d16c266e7ef93297ce7ed4b0"} Dec 05 16:14:51 crc kubenswrapper[4778]: I1205 16:14:51.929906 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" Dec 05 16:14:51 crc kubenswrapper[4778]: I1205 16:14:51.949684 4778 
Dec 05 16:14:51 crc kubenswrapper[4778]: I1205 16:14:51.949684 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" podStartSLOduration=3.984445414 podStartE2EDuration="1m14.949667229s" podCreationTimestamp="2025-12-05 16:13:37 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.86543731 +0000 UTC m=+1106.969233700" lastFinishedPulling="2025-12-05 16:14:50.830659135 +0000 UTC m=+1177.934455515" observedRunningTime="2025-12-05 16:14:51.949307689 +0000 UTC m=+1179.053104069" watchObservedRunningTime="2025-12-05 16:14:51.949667229 +0000 UTC m=+1179.053463609"
Dec 05 16:14:56 crc kubenswrapper[4778]: I1205 16:14:56.974883 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727" event={"ID":"48f6bc28-3426-42fb-9498-1280593297ea","Type":"ContainerStarted","Data":"a6ef1a8c1108c7ff750264bd03a188fc9f4fe8dfe8c2f8f6172a609fbccbd57e"}
Dec 05 16:14:56 crc kubenswrapper[4778]: I1205 16:14:56.993791 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6727" podStartSLOduration=2.527472908 podStartE2EDuration="1m18.993772126s" podCreationTimestamp="2025-12-05 16:13:38 +0000 UTC" firstStartedPulling="2025-12-05 16:13:39.851026707 +0000 UTC m=+1106.954823087" lastFinishedPulling="2025-12-05 16:14:56.317325925 +0000 UTC m=+1183.421122305" observedRunningTime="2025-12-05 16:14:56.992457051 +0000 UTC m=+1184.096253471" watchObservedRunningTime="2025-12-05 16:14:56.993772126 +0000 UTC m=+1184.097568506"
Dec 05 16:14:58 crc kubenswrapper[4778]: I1205 16:14:58.471382 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hc7pn"
Dec 05 16:14:58 crc kubenswrapper[4778]: I1205 16:14:58.533283 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp"
Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.139310 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn"]
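The collect-profiles-29415855-lmgjn pod added at 16:15:00 comes from OLM's collect-profiles CronJob, and the numeric part of the name is the Job's scheduled time in minutes since the Unix epoch, the convention the CronJob controller uses when naming Jobs. Decoding it reproduces the log timestamp:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const suffix = 29415855 // from collect-profiles-29415855-lmgjn
        // CronJob-created Jobs encode the scheduled time in epoch minutes.
        fmt.Println(time.Unix(suffix*60, 0).UTC()) // 2025-12-05 16:15:00 +0000 UTC
    }

The result lines up with the "SyncLoop ADD" entry above and with collect-profiles firing on a 15-minute boundary.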
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.144418 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.144471 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.151930 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn"] Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.200267 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a44f580-21af-464b-a03e-fbd39614e1f9-config-volume\") pod \"collect-profiles-29415855-lmgjn\" (UID: \"4a44f580-21af-464b-a03e-fbd39614e1f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.200345 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkkwd\" (UniqueName: \"kubernetes.io/projected/4a44f580-21af-464b-a03e-fbd39614e1f9-kube-api-access-pkkwd\") pod \"collect-profiles-29415855-lmgjn\" (UID: \"4a44f580-21af-464b-a03e-fbd39614e1f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.200427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a44f580-21af-464b-a03e-fbd39614e1f9-secret-volume\") pod \"collect-profiles-29415855-lmgjn\" (UID: \"4a44f580-21af-464b-a03e-fbd39614e1f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.301682 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a44f580-21af-464b-a03e-fbd39614e1f9-secret-volume\") pod \"collect-profiles-29415855-lmgjn\" (UID: \"4a44f580-21af-464b-a03e-fbd39614e1f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.301853 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a44f580-21af-464b-a03e-fbd39614e1f9-config-volume\") pod \"collect-profiles-29415855-lmgjn\" (UID: \"4a44f580-21af-464b-a03e-fbd39614e1f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.301967 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkkwd\" (UniqueName: \"kubernetes.io/projected/4a44f580-21af-464b-a03e-fbd39614e1f9-kube-api-access-pkkwd\") pod \"collect-profiles-29415855-lmgjn\" (UID: \"4a44f580-21af-464b-a03e-fbd39614e1f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.302821 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a44f580-21af-464b-a03e-fbd39614e1f9-config-volume\") pod 
\"collect-profiles-29415855-lmgjn\" (UID: \"4a44f580-21af-464b-a03e-fbd39614e1f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.307736 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a44f580-21af-464b-a03e-fbd39614e1f9-secret-volume\") pod \"collect-profiles-29415855-lmgjn\" (UID: \"4a44f580-21af-464b-a03e-fbd39614e1f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.319919 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkkwd\" (UniqueName: \"kubernetes.io/projected/4a44f580-21af-464b-a03e-fbd39614e1f9-kube-api-access-pkkwd\") pod \"collect-profiles-29415855-lmgjn\" (UID: \"4a44f580-21af-464b-a03e-fbd39614e1f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:00 crc kubenswrapper[4778]: I1205 16:15:00.469016 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:01 crc kubenswrapper[4778]: I1205 16:15:01.001251 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn"] Dec 05 16:15:01 crc kubenswrapper[4778]: W1205 16:15:01.007602 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a44f580_21af_464b_a03e_fbd39614e1f9.slice/crio-73bffe13c362f7496cf57305a2ff59ccce8dc79962b51be407388273c075e0d4 WatchSource:0}: Error finding container 73bffe13c362f7496cf57305a2ff59ccce8dc79962b51be407388273c075e0d4: Status 404 returned error can't find the container with id 73bffe13c362f7496cf57305a2ff59ccce8dc79962b51be407388273c075e0d4 Dec 05 16:15:02 crc kubenswrapper[4778]: I1205 16:15:02.010619 4778 generic.go:334] "Generic (PLEG): container finished" podID="4a44f580-21af-464b-a03e-fbd39614e1f9" containerID="8daa378cac1ab97776954f611d4f0378a1a81c33014925887a6950e56ce90baf" exitCode=0 Dec 05 16:15:02 crc kubenswrapper[4778]: I1205 16:15:02.010814 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" event={"ID":"4a44f580-21af-464b-a03e-fbd39614e1f9","Type":"ContainerDied","Data":"8daa378cac1ab97776954f611d4f0378a1a81c33014925887a6950e56ce90baf"} Dec 05 16:15:02 crc kubenswrapper[4778]: I1205 16:15:02.010973 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" event={"ID":"4a44f580-21af-464b-a03e-fbd39614e1f9","Type":"ContainerStarted","Data":"73bffe13c362f7496cf57305a2ff59ccce8dc79962b51be407388273c075e0d4"} Dec 05 16:15:03 crc kubenswrapper[4778]: I1205 16:15:03.269802 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:03 crc kubenswrapper[4778]: I1205 16:15:03.345258 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a44f580-21af-464b-a03e-fbd39614e1f9-secret-volume\") pod \"4a44f580-21af-464b-a03e-fbd39614e1f9\" (UID: \"4a44f580-21af-464b-a03e-fbd39614e1f9\") " Dec 05 16:15:03 crc kubenswrapper[4778]: I1205 16:15:03.345414 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a44f580-21af-464b-a03e-fbd39614e1f9-config-volume\") pod \"4a44f580-21af-464b-a03e-fbd39614e1f9\" (UID: \"4a44f580-21af-464b-a03e-fbd39614e1f9\") " Dec 05 16:15:03 crc kubenswrapper[4778]: I1205 16:15:03.345469 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkkwd\" (UniqueName: \"kubernetes.io/projected/4a44f580-21af-464b-a03e-fbd39614e1f9-kube-api-access-pkkwd\") pod \"4a44f580-21af-464b-a03e-fbd39614e1f9\" (UID: \"4a44f580-21af-464b-a03e-fbd39614e1f9\") " Dec 05 16:15:03 crc kubenswrapper[4778]: I1205 16:15:03.346918 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a44f580-21af-464b-a03e-fbd39614e1f9-config-volume" (OuterVolumeSpecName: "config-volume") pod "4a44f580-21af-464b-a03e-fbd39614e1f9" (UID: "4a44f580-21af-464b-a03e-fbd39614e1f9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:03 crc kubenswrapper[4778]: I1205 16:15:03.351029 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a44f580-21af-464b-a03e-fbd39614e1f9-kube-api-access-pkkwd" (OuterVolumeSpecName: "kube-api-access-pkkwd") pod "4a44f580-21af-464b-a03e-fbd39614e1f9" (UID: "4a44f580-21af-464b-a03e-fbd39614e1f9"). InnerVolumeSpecName "kube-api-access-pkkwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:03 crc kubenswrapper[4778]: I1205 16:15:03.351063 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a44f580-21af-464b-a03e-fbd39614e1f9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4a44f580-21af-464b-a03e-fbd39614e1f9" (UID: "4a44f580-21af-464b-a03e-fbd39614e1f9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:03 crc kubenswrapper[4778]: I1205 16:15:03.447030 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a44f580-21af-464b-a03e-fbd39614e1f9-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:03 crc kubenswrapper[4778]: I1205 16:15:03.447069 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkkwd\" (UniqueName: \"kubernetes.io/projected/4a44f580-21af-464b-a03e-fbd39614e1f9-kube-api-access-pkkwd\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:03 crc kubenswrapper[4778]: I1205 16:15:03.447084 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a44f580-21af-464b-a03e-fbd39614e1f9-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:04 crc kubenswrapper[4778]: I1205 16:15:04.040779 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" event={"ID":"4a44f580-21af-464b-a03e-fbd39614e1f9","Type":"ContainerDied","Data":"73bffe13c362f7496cf57305a2ff59ccce8dc79962b51be407388273c075e0d4"} Dec 05 16:15:04 crc kubenswrapper[4778]: I1205 16:15:04.040827 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73bffe13c362f7496cf57305a2ff59ccce8dc79962b51be407388273c075e0d4" Dec 05 16:15:04 crc kubenswrapper[4778]: I1205 16:15:04.040827 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn" Dec 05 16:15:04 crc kubenswrapper[4778]: I1205 16:15:04.321691 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"] Dec 05 16:15:04 crc kubenswrapper[4778]: I1205 16:15:04.321956 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl" podUID="aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b" containerName="operator" containerID="cri-o://8aa93b891d3edfd33ea9b6b490ceae30099254c54cf2ecc6cd9780117b0d3c2e" gracePeriod=10 Dec 05 16:15:05 crc kubenswrapper[4778]: I1205 16:15:05.049947 4778 generic.go:334] "Generic (PLEG): container finished" podID="aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b" containerID="8aa93b891d3edfd33ea9b6b490ceae30099254c54cf2ecc6cd9780117b0d3c2e" exitCode=0 Dec 05 16:15:05 crc kubenswrapper[4778]: I1205 16:15:05.050032 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl" event={"ID":"aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b","Type":"ContainerDied","Data":"8aa93b891d3edfd33ea9b6b490ceae30099254c54cf2ecc6cd9780117b0d3c2e"} Dec 05 16:15:05 crc kubenswrapper[4778]: I1205 16:15:05.836934 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl" Dec 05 16:15:05 crc kubenswrapper[4778]: I1205 16:15:05.878439 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7jcs\" (UniqueName: \"kubernetes.io/projected/aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b-kube-api-access-m7jcs\") pod \"aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b\" (UID: \"aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b\") " Dec 05 16:15:05 crc kubenswrapper[4778]: I1205 16:15:05.883481 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b-kube-api-access-m7jcs" (OuterVolumeSpecName: "kube-api-access-m7jcs") pod "aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b" (UID: "aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b"). InnerVolumeSpecName "kube-api-access-m7jcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:05 crc kubenswrapper[4778]: I1205 16:15:05.980275 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7jcs\" (UniqueName: \"kubernetes.io/projected/aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b-kube-api-access-m7jcs\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:06 crc kubenswrapper[4778]: I1205 16:15:06.059474 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl" event={"ID":"aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b","Type":"ContainerDied","Data":"9e42c1bf0dc44841a8bf4af2f1b2786e01b216a695f050d92f2fa53ed4ba0623"} Dec 05 16:15:06 crc kubenswrapper[4778]: I1205 16:15:06.059511 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl" Dec 05 16:15:06 crc kubenswrapper[4778]: I1205 16:15:06.059553 4778 scope.go:117] "RemoveContainer" containerID="8aa93b891d3edfd33ea9b6b490ceae30099254c54cf2ecc6cd9780117b0d3c2e" Dec 05 16:15:06 crc kubenswrapper[4778]: I1205 16:15:06.104211 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"] Dec 05 16:15:06 crc kubenswrapper[4778]: I1205 16:15:06.110628 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d55d8cbb8-cskkl"] Dec 05 16:15:06 crc kubenswrapper[4778]: I1205 16:15:06.280259 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp"] Dec 05 16:15:06 crc kubenswrapper[4778]: I1205 16:15:06.280602 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" containerName="kube-rbac-proxy" containerID="cri-o://f0e9843bf43526b483227b11cbcdb0ba4770aafa05c05dacd7d244a2123073f9" gracePeriod=10 Dec 05 16:15:06 crc kubenswrapper[4778]: I1205 16:15:06.280747 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" containerName="manager" containerID="cri-o://4dedacfd4f3f3e0fd3fb6f4b856e2aa470fbdd74d16c266e7ef93297ce7ed4b0" gracePeriod=10 Dec 05 16:15:07 crc kubenswrapper[4778]: I1205 16:15:07.066932 4778 generic.go:334] "Generic (PLEG): container finished" podID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" 
containerID="4dedacfd4f3f3e0fd3fb6f4b856e2aa470fbdd74d16c266e7ef93297ce7ed4b0" exitCode=0 Dec 05 16:15:07 crc kubenswrapper[4778]: I1205 16:15:07.067918 4778 generic.go:334] "Generic (PLEG): container finished" podID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" containerID="f0e9843bf43526b483227b11cbcdb0ba4770aafa05c05dacd7d244a2123073f9" exitCode=0 Dec 05 16:15:07 crc kubenswrapper[4778]: I1205 16:15:07.067026 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" event={"ID":"cd874cca-18d0-4bcc-a436-00d6e9bceb9e","Type":"ContainerDied","Data":"4dedacfd4f3f3e0fd3fb6f4b856e2aa470fbdd74d16c266e7ef93297ce7ed4b0"} Dec 05 16:15:07 crc kubenswrapper[4778]: I1205 16:15:07.068068 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" event={"ID":"cd874cca-18d0-4bcc-a436-00d6e9bceb9e","Type":"ContainerDied","Data":"f0e9843bf43526b483227b11cbcdb0ba4770aafa05c05dacd7d244a2123073f9"} Dec 05 16:15:07 crc kubenswrapper[4778]: I1205 16:15:07.260838 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b" path="/var/lib/kubelet/pods/aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b/volumes" Dec 05 16:15:08 crc kubenswrapper[4778]: I1205 16:15:08.532202 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8081/readyz\": dial tcp 10.217.0.95:8081: connect: connection refused" Dec 05 16:15:09 crc kubenswrapper[4778]: I1205 16:15:09.370252 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" Dec 05 16:15:09 crc kubenswrapper[4778]: I1205 16:15:09.443576 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7w42\" (UniqueName: \"kubernetes.io/projected/cd874cca-18d0-4bcc-a436-00d6e9bceb9e-kube-api-access-b7w42\") pod \"cd874cca-18d0-4bcc-a436-00d6e9bceb9e\" (UID: \"cd874cca-18d0-4bcc-a436-00d6e9bceb9e\") " Dec 05 16:15:09 crc kubenswrapper[4778]: I1205 16:15:09.456849 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd874cca-18d0-4bcc-a436-00d6e9bceb9e-kube-api-access-b7w42" (OuterVolumeSpecName: "kube-api-access-b7w42") pod "cd874cca-18d0-4bcc-a436-00d6e9bceb9e" (UID: "cd874cca-18d0-4bcc-a436-00d6e9bceb9e"). InnerVolumeSpecName "kube-api-access-b7w42". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:09 crc kubenswrapper[4778]: I1205 16:15:09.544362 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7w42\" (UniqueName: \"kubernetes.io/projected/cd874cca-18d0-4bcc-a436-00d6e9bceb9e-kube-api-access-b7w42\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:10 crc kubenswrapper[4778]: I1205 16:15:10.101708 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" event={"ID":"cd874cca-18d0-4bcc-a436-00d6e9bceb9e","Type":"ContainerDied","Data":"f0dc0d48cf20e7782cc5c0c336ed21f2a3c1b59306fba53e75b81b4e19c69f5c"} Dec 05 16:15:10 crc kubenswrapper[4778]: I1205 16:15:10.101763 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp" Dec 05 16:15:10 crc kubenswrapper[4778]: I1205 16:15:10.101806 4778 scope.go:117] "RemoveContainer" containerID="4dedacfd4f3f3e0fd3fb6f4b856e2aa470fbdd74d16c266e7ef93297ce7ed4b0" Dec 05 16:15:10 crc kubenswrapper[4778]: I1205 16:15:10.129771 4778 scope.go:117] "RemoveContainer" containerID="f0e9843bf43526b483227b11cbcdb0ba4770aafa05c05dacd7d244a2123073f9" Dec 05 16:15:10 crc kubenswrapper[4778]: I1205 16:15:10.138953 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp"] Dec 05 16:15:10 crc kubenswrapper[4778]: I1205 16:15:10.144624 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6447f74d5-prnmp"] Dec 05 16:15:11 crc kubenswrapper[4778]: I1205 16:15:11.259595 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" path="/var/lib/kubelet/pods/cd874cca-18d0-4bcc-a436-00d6e9bceb9e/volumes" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.701487 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-index-fwgv5"] Dec 05 16:15:13 crc kubenswrapper[4778]: E1205 16:15:13.702604 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" containerName="kube-rbac-proxy" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.702689 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" containerName="kube-rbac-proxy" Dec 05 16:15:13 crc kubenswrapper[4778]: E1205 16:15:13.702800 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b" containerName="operator" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.702861 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b" containerName="operator" Dec 05 16:15:13 crc kubenswrapper[4778]: E1205 16:15:13.702960 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a44f580-21af-464b-a03e-fbd39614e1f9" containerName="collect-profiles" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.703026 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a44f580-21af-464b-a03e-fbd39614e1f9" containerName="collect-profiles" Dec 05 16:15:13 crc kubenswrapper[4778]: E1205 16:15:13.703099 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" containerName="manager" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.703155 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" containerName="manager" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.703387 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a44f580-21af-464b-a03e-fbd39614e1f9" containerName="collect-profiles" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.703465 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae67ac1-2aeb-4097-83ab-4ec6d56b7f7b" containerName="operator" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.703558 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" containerName="kube-rbac-proxy" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.703648 4778 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="cd874cca-18d0-4bcc-a436-00d6e9bceb9e" containerName="manager" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.704442 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-index-fwgv5" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.707555 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-index-dockercfg-9dwkq" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.729551 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-fwgv5"] Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.845818 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j69sk\" (UniqueName: \"kubernetes.io/projected/4f97e6d4-7346-4696-901f-eb6822513707-kube-api-access-j69sk\") pod \"watcher-operator-index-fwgv5\" (UID: \"4f97e6d4-7346-4696-901f-eb6822513707\") " pod="openstack-operators/watcher-operator-index-fwgv5" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.948073 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j69sk\" (UniqueName: \"kubernetes.io/projected/4f97e6d4-7346-4696-901f-eb6822513707-kube-api-access-j69sk\") pod \"watcher-operator-index-fwgv5\" (UID: \"4f97e6d4-7346-4696-901f-eb6822513707\") " pod="openstack-operators/watcher-operator-index-fwgv5" Dec 05 16:15:13 crc kubenswrapper[4778]: I1205 16:15:13.980917 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j69sk\" (UniqueName: \"kubernetes.io/projected/4f97e6d4-7346-4696-901f-eb6822513707-kube-api-access-j69sk\") pod \"watcher-operator-index-fwgv5\" (UID: \"4f97e6d4-7346-4696-901f-eb6822513707\") " pod="openstack-operators/watcher-operator-index-fwgv5" Dec 05 16:15:14 crc kubenswrapper[4778]: I1205 16:15:14.051084 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-index-fwgv5" Dec 05 16:15:14 crc kubenswrapper[4778]: I1205 16:15:14.557060 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-fwgv5"] Dec 05 16:15:14 crc kubenswrapper[4778]: W1205 16:15:14.574691 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f97e6d4_7346_4696_901f_eb6822513707.slice/crio-b135fbc5d7bc1a40b2a36b959fdbd2c6480c9dc8ec30b07131aa7850a501a2d6 WatchSource:0}: Error finding container b135fbc5d7bc1a40b2a36b959fdbd2c6480c9dc8ec30b07131aa7850a501a2d6: Status 404 returned error can't find the container with id b135fbc5d7bc1a40b2a36b959fdbd2c6480c9dc8ec30b07131aa7850a501a2d6 Dec 05 16:15:15 crc kubenswrapper[4778]: I1205 16:15:15.140533 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-fwgv5" event={"ID":"4f97e6d4-7346-4696-901f-eb6822513707","Type":"ContainerStarted","Data":"b135fbc5d7bc1a40b2a36b959fdbd2c6480c9dc8ec30b07131aa7850a501a2d6"} Dec 05 16:15:16 crc kubenswrapper[4778]: I1205 16:15:16.149392 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-fwgv5" event={"ID":"4f97e6d4-7346-4696-901f-eb6822513707","Type":"ContainerStarted","Data":"6320a529d5de6f5920d9eca9d66a0b2697abb61f3f9e9caa405cf1ae04f0901f"} Dec 05 16:15:16 crc kubenswrapper[4778]: I1205 16:15:16.169914 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-index-fwgv5" podStartSLOduration=2.106297824 podStartE2EDuration="3.169895506s" podCreationTimestamp="2025-12-05 16:15:13 +0000 UTC" firstStartedPulling="2025-12-05 16:15:14.577345524 +0000 UTC m=+1201.681141904" lastFinishedPulling="2025-12-05 16:15:15.640943186 +0000 UTC m=+1202.744739586" observedRunningTime="2025-12-05 16:15:16.164133303 +0000 UTC m=+1203.267929693" watchObservedRunningTime="2025-12-05 16:15:16.169895506 +0000 UTC m=+1203.273691886" Dec 05 16:15:24 crc kubenswrapper[4778]: I1205 16:15:24.051624 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-index-fwgv5" Dec 05 16:15:24 crc kubenswrapper[4778]: I1205 16:15:24.052051 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/watcher-operator-index-fwgv5" Dec 05 16:15:24 crc kubenswrapper[4778]: I1205 16:15:24.080773 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/watcher-operator-index-fwgv5" Dec 05 16:15:24 crc kubenswrapper[4778]: I1205 16:15:24.242699 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-index-fwgv5" Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.329504 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd"] Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.332171 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.334414 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-d8zml" Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.346760 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd"] Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.455250 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-util\") pod \"b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd\" (UID: \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\") " pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.455304 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-bundle\") pod \"b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd\" (UID: \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\") " pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.456010 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29fc9\" (UniqueName: \"kubernetes.io/projected/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-kube-api-access-29fc9\") pod \"b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd\" (UID: \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\") " pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.557160 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-bundle\") pod \"b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd\" (UID: \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\") " pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.557246 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29fc9\" (UniqueName: \"kubernetes.io/projected/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-kube-api-access-29fc9\") pod \"b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd\" (UID: \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\") " pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.557307 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-util\") pod \"b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd\" (UID: \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\") " pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.558138 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-bundle\") pod \"b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd\" (UID: \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\") " pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.558189 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-util\") pod \"b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd\" (UID: \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\") " pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.580201 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29fc9\" (UniqueName: \"kubernetes.io/projected/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-kube-api-access-29fc9\") pod \"b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd\" (UID: \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\") " pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:27 crc kubenswrapper[4778]: I1205 16:15:27.651399 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:28 crc kubenswrapper[4778]: I1205 16:15:28.054645 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd"] Dec 05 16:15:28 crc kubenswrapper[4778]: I1205 16:15:28.245447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" event={"ID":"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd","Type":"ContainerStarted","Data":"b52852062a73ca2dc88c13d93be0cda86861141c514f8687263b7870ea8862dc"} Dec 05 16:15:29 crc kubenswrapper[4778]: I1205 16:15:29.260422 4778 generic.go:334] "Generic (PLEG): container finished" podID="6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" containerID="3d1dda3d3156f97d934b84f2cbddf8c0872e6da1ff15c9b8b640254724c598ea" exitCode=0 Dec 05 16:15:29 crc kubenswrapper[4778]: I1205 16:15:29.264539 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" event={"ID":"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd","Type":"ContainerDied","Data":"3d1dda3d3156f97d934b84f2cbddf8c0872e6da1ff15c9b8b640254724c598ea"} Dec 05 16:15:30 crc kubenswrapper[4778]: I1205 16:15:30.268465 4778 generic.go:334] "Generic (PLEG): container finished" podID="6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" containerID="be282103a98194749b2f99cd9284d1f163b555b76a4056df91f05b264e1e9446" exitCode=0 Dec 05 16:15:30 crc kubenswrapper[4778]: I1205 16:15:30.268500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" event={"ID":"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd","Type":"ContainerDied","Data":"be282103a98194749b2f99cd9284d1f163b555b76a4056df91f05b264e1e9446"} Dec 05 16:15:32 crc kubenswrapper[4778]: I1205 16:15:32.287538 4778 generic.go:334] "Generic (PLEG): container finished" podID="6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" containerID="ee5f4ba4c06e6392c781f11eece165d56027e50feea7fe72d8deaeb7bfd45e85" exitCode=0 Dec 05 16:15:32 crc kubenswrapper[4778]: I1205 16:15:32.287607 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" event={"ID":"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd","Type":"ContainerDied","Data":"ee5f4ba4c06e6392c781f11eece165d56027e50feea7fe72d8deaeb7bfd45e85"} Dec 05 16:15:33 crc kubenswrapper[4778]: I1205 16:15:33.555339 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:33 crc kubenswrapper[4778]: I1205 16:15:33.749855 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-bundle\") pod \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\" (UID: \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\") " Dec 05 16:15:33 crc kubenswrapper[4778]: I1205 16:15:33.749943 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-util\") pod \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\" (UID: \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\") " Dec 05 16:15:33 crc kubenswrapper[4778]: I1205 16:15:33.750015 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29fc9\" (UniqueName: \"kubernetes.io/projected/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-kube-api-access-29fc9\") pod \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\" (UID: \"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd\") " Dec 05 16:15:33 crc kubenswrapper[4778]: I1205 16:15:33.757023 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-kube-api-access-29fc9" (OuterVolumeSpecName: "kube-api-access-29fc9") pod "6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" (UID: "6a9f95dc-00ca-4792-9fe4-3b25e68b80fd"). InnerVolumeSpecName "kube-api-access-29fc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:33 crc kubenswrapper[4778]: I1205 16:15:33.758941 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-bundle" (OuterVolumeSpecName: "bundle") pod "6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" (UID: "6a9f95dc-00ca-4792-9fe4-3b25e68b80fd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:15:33 crc kubenswrapper[4778]: I1205 16:15:33.764033 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-util" (OuterVolumeSpecName: "util") pod "6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" (UID: "6a9f95dc-00ca-4792-9fe4-3b25e68b80fd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:15:33 crc kubenswrapper[4778]: I1205 16:15:33.851161 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29fc9\" (UniqueName: \"kubernetes.io/projected/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-kube-api-access-29fc9\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:33 crc kubenswrapper[4778]: I1205 16:15:33.851510 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:33 crc kubenswrapper[4778]: I1205 16:15:33.851580 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a9f95dc-00ca-4792-9fe4-3b25e68b80fd-util\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:34 crc kubenswrapper[4778]: I1205 16:15:34.303466 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" event={"ID":"6a9f95dc-00ca-4792-9fe4-3b25e68b80fd","Type":"ContainerDied","Data":"b52852062a73ca2dc88c13d93be0cda86861141c514f8687263b7870ea8862dc"} Dec 05 16:15:34 crc kubenswrapper[4778]: I1205 16:15:34.303508 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b52852062a73ca2dc88c13d93be0cda86861141c514f8687263b7870ea8862dc" Dec 05 16:15:34 crc kubenswrapper[4778]: I1205 16:15:34.303551 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.669254 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w"] Dec 05 16:15:39 crc kubenswrapper[4778]: E1205 16:15:39.670106 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" containerName="util" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.670120 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" containerName="util" Dec 05 16:15:39 crc kubenswrapper[4778]: E1205 16:15:39.670148 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" containerName="pull" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.670155 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" containerName="pull" Dec 05 16:15:39 crc kubenswrapper[4778]: E1205 16:15:39.670171 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" containerName="extract" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.670179 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" containerName="extract" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.670324 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9f95dc-00ca-4792-9fe4-3b25e68b80fd" containerName="extract" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.670863 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.673019 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-service-cert" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.674084 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-ch9c7" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.682606 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w"] Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.739629 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50119f5c-0d94-4c41-94bb-e335b6c8614a-apiservice-cert\") pod \"watcher-operator-controller-manager-8558ff9844-wgz5w\" (UID: \"50119f5c-0d94-4c41-94bb-e335b6c8614a\") " pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.739679 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50119f5c-0d94-4c41-94bb-e335b6c8614a-webhook-cert\") pod \"watcher-operator-controller-manager-8558ff9844-wgz5w\" (UID: \"50119f5c-0d94-4c41-94bb-e335b6c8614a\") " pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.739897 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mvml\" (UniqueName: \"kubernetes.io/projected/50119f5c-0d94-4c41-94bb-e335b6c8614a-kube-api-access-8mvml\") pod \"watcher-operator-controller-manager-8558ff9844-wgz5w\" (UID: \"50119f5c-0d94-4c41-94bb-e335b6c8614a\") " pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.841402 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50119f5c-0d94-4c41-94bb-e335b6c8614a-apiservice-cert\") pod \"watcher-operator-controller-manager-8558ff9844-wgz5w\" (UID: \"50119f5c-0d94-4c41-94bb-e335b6c8614a\") " pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.841455 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50119f5c-0d94-4c41-94bb-e335b6c8614a-webhook-cert\") pod \"watcher-operator-controller-manager-8558ff9844-wgz5w\" (UID: \"50119f5c-0d94-4c41-94bb-e335b6c8614a\") " pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.841509 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mvml\" (UniqueName: \"kubernetes.io/projected/50119f5c-0d94-4c41-94bb-e335b6c8614a-kube-api-access-8mvml\") pod \"watcher-operator-controller-manager-8558ff9844-wgz5w\" (UID: \"50119f5c-0d94-4c41-94bb-e335b6c8614a\") " pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.849013 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50119f5c-0d94-4c41-94bb-e335b6c8614a-apiservice-cert\") pod \"watcher-operator-controller-manager-8558ff9844-wgz5w\" (UID: \"50119f5c-0d94-4c41-94bb-e335b6c8614a\") " pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.849877 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50119f5c-0d94-4c41-94bb-e335b6c8614a-webhook-cert\") pod \"watcher-operator-controller-manager-8558ff9844-wgz5w\" (UID: \"50119f5c-0d94-4c41-94bb-e335b6c8614a\") " pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.859751 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mvml\" (UniqueName: \"kubernetes.io/projected/50119f5c-0d94-4c41-94bb-e335b6c8614a-kube-api-access-8mvml\") pod \"watcher-operator-controller-manager-8558ff9844-wgz5w\" (UID: \"50119f5c-0d94-4c41-94bb-e335b6c8614a\") " pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:39 crc kubenswrapper[4778]: I1205 16:15:39.989624 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:40 crc kubenswrapper[4778]: I1205 16:15:40.497133 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w"] Dec 05 16:15:41 crc kubenswrapper[4778]: I1205 16:15:41.359668 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" event={"ID":"50119f5c-0d94-4c41-94bb-e335b6c8614a","Type":"ContainerStarted","Data":"0d576638bb13d5cef960cf958e6ec389e1b9048b1b42747bb5263ff9bd3fb83d"} Dec 05 16:15:41 crc kubenswrapper[4778]: I1205 16:15:41.359979 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" event={"ID":"50119f5c-0d94-4c41-94bb-e335b6c8614a","Type":"ContainerStarted","Data":"e69e7e8a9b43e022d7ef092be09054349c7cd41e861b3c517728736a1db58ba3"} Dec 05 16:15:41 crc kubenswrapper[4778]: I1205 16:15:41.359995 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:41 crc kubenswrapper[4778]: I1205 16:15:41.391613 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" podStartSLOduration=2.391591534 podStartE2EDuration="2.391591534s" podCreationTimestamp="2025-12-05 16:15:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:15:41.379873232 +0000 UTC m=+1228.483669632" watchObservedRunningTime="2025-12-05 16:15:41.391591534 +0000 UTC m=+1228.495387914" Dec 05 16:15:49 crc kubenswrapper[4778]: I1205 16:15:49.996344 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.340829 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm"] Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.342014 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.373714 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm"] Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.503827 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdhtl\" (UniqueName: \"kubernetes.io/projected/94682128-df97-406d-8947-1e8dd8199a99-kube-api-access-wdhtl\") pod \"watcher-operator-controller-manager-5dd64f5b5b-rmlwm\" (UID: \"94682128-df97-406d-8947-1e8dd8199a99\") " pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.503891 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94682128-df97-406d-8947-1e8dd8199a99-webhook-cert\") pod \"watcher-operator-controller-manager-5dd64f5b5b-rmlwm\" (UID: \"94682128-df97-406d-8947-1e8dd8199a99\") " pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.503918 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94682128-df97-406d-8947-1e8dd8199a99-apiservice-cert\") pod \"watcher-operator-controller-manager-5dd64f5b5b-rmlwm\" (UID: \"94682128-df97-406d-8947-1e8dd8199a99\") " pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.605603 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94682128-df97-406d-8947-1e8dd8199a99-apiservice-cert\") pod \"watcher-operator-controller-manager-5dd64f5b5b-rmlwm\" (UID: \"94682128-df97-406d-8947-1e8dd8199a99\") " pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.605716 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdhtl\" (UniqueName: \"kubernetes.io/projected/94682128-df97-406d-8947-1e8dd8199a99-kube-api-access-wdhtl\") pod \"watcher-operator-controller-manager-5dd64f5b5b-rmlwm\" (UID: \"94682128-df97-406d-8947-1e8dd8199a99\") " pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.605758 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94682128-df97-406d-8947-1e8dd8199a99-webhook-cert\") pod \"watcher-operator-controller-manager-5dd64f5b5b-rmlwm\" (UID: \"94682128-df97-406d-8947-1e8dd8199a99\") " pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.613116 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94682128-df97-406d-8947-1e8dd8199a99-webhook-cert\") pod \"watcher-operator-controller-manager-5dd64f5b5b-rmlwm\" 
(UID: \"94682128-df97-406d-8947-1e8dd8199a99\") " pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.613349 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94682128-df97-406d-8947-1e8dd8199a99-apiservice-cert\") pod \"watcher-operator-controller-manager-5dd64f5b5b-rmlwm\" (UID: \"94682128-df97-406d-8947-1e8dd8199a99\") " pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.629763 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdhtl\" (UniqueName: \"kubernetes.io/projected/94682128-df97-406d-8947-1e8dd8199a99-kube-api-access-wdhtl\") pod \"watcher-operator-controller-manager-5dd64f5b5b-rmlwm\" (UID: \"94682128-df97-406d-8947-1e8dd8199a99\") " pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:15:51 crc kubenswrapper[4778]: I1205 16:15:51.660896 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:15:52 crc kubenswrapper[4778]: I1205 16:15:52.172283 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm"] Dec 05 16:15:52 crc kubenswrapper[4778]: W1205 16:15:52.183923 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94682128_df97_406d_8947_1e8dd8199a99.slice/crio-31b5c76eb8c9829eb8b4de011e88ff826acc7c66e135344fb624b350690eeeed WatchSource:0}: Error finding container 31b5c76eb8c9829eb8b4de011e88ff826acc7c66e135344fb624b350690eeeed: Status 404 returned error can't find the container with id 31b5c76eb8c9829eb8b4de011e88ff826acc7c66e135344fb624b350690eeeed Dec 05 16:15:52 crc kubenswrapper[4778]: I1205 16:15:52.443158 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" event={"ID":"94682128-df97-406d-8947-1e8dd8199a99","Type":"ContainerStarted","Data":"ff2ef4ca014e94ded026babac0f314d7431b6d5556fd12d50cb1bd3356d2716b"} Dec 05 16:15:52 crc kubenswrapper[4778]: I1205 16:15:52.443593 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:15:52 crc kubenswrapper[4778]: I1205 16:15:52.443961 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" event={"ID":"94682128-df97-406d-8947-1e8dd8199a99","Type":"ContainerStarted","Data":"31b5c76eb8c9829eb8b4de011e88ff826acc7c66e135344fb624b350690eeeed"} Dec 05 16:15:52 crc kubenswrapper[4778]: I1205 16:15:52.462851 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" podStartSLOduration=1.462816331 podStartE2EDuration="1.462816331s" podCreationTimestamp="2025-12-05 16:15:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:15:52.461460316 +0000 UTC m=+1239.565256696" watchObservedRunningTime="2025-12-05 16:15:52.462816331 +0000 UTC m=+1239.566612731" Dec 05 16:16:01 crc kubenswrapper[4778]: I1205 16:16:01.671463 
4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5dd64f5b5b-rmlwm" Dec 05 16:16:01 crc kubenswrapper[4778]: I1205 16:16:01.739171 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w"] Dec 05 16:16:01 crc kubenswrapper[4778]: I1205 16:16:01.739578 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" podUID="50119f5c-0d94-4c41-94bb-e335b6c8614a" containerName="manager" containerID="cri-o://0d576638bb13d5cef960cf958e6ec389e1b9048b1b42747bb5263ff9bd3fb83d" gracePeriod=10 Dec 05 16:16:02 crc kubenswrapper[4778]: I1205 16:16:02.523277 4778 generic.go:334] "Generic (PLEG): container finished" podID="50119f5c-0d94-4c41-94bb-e335b6c8614a" containerID="0d576638bb13d5cef960cf958e6ec389e1b9048b1b42747bb5263ff9bd3fb83d" exitCode=0 Dec 05 16:16:02 crc kubenswrapper[4778]: I1205 16:16:02.523348 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" event={"ID":"50119f5c-0d94-4c41-94bb-e335b6c8614a","Type":"ContainerDied","Data":"0d576638bb13d5cef960cf958e6ec389e1b9048b1b42747bb5263ff9bd3fb83d"} Dec 05 16:16:02 crc kubenswrapper[4778]: I1205 16:16:02.721141 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:16:02 crc kubenswrapper[4778]: I1205 16:16:02.868670 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50119f5c-0d94-4c41-94bb-e335b6c8614a-apiservice-cert\") pod \"50119f5c-0d94-4c41-94bb-e335b6c8614a\" (UID: \"50119f5c-0d94-4c41-94bb-e335b6c8614a\") " Dec 05 16:16:02 crc kubenswrapper[4778]: I1205 16:16:02.868874 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mvml\" (UniqueName: \"kubernetes.io/projected/50119f5c-0d94-4c41-94bb-e335b6c8614a-kube-api-access-8mvml\") pod \"50119f5c-0d94-4c41-94bb-e335b6c8614a\" (UID: \"50119f5c-0d94-4c41-94bb-e335b6c8614a\") " Dec 05 16:16:02 crc kubenswrapper[4778]: I1205 16:16:02.868960 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50119f5c-0d94-4c41-94bb-e335b6c8614a-webhook-cert\") pod \"50119f5c-0d94-4c41-94bb-e335b6c8614a\" (UID: \"50119f5c-0d94-4c41-94bb-e335b6c8614a\") " Dec 05 16:16:02 crc kubenswrapper[4778]: I1205 16:16:02.875003 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50119f5c-0d94-4c41-94bb-e335b6c8614a-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "50119f5c-0d94-4c41-94bb-e335b6c8614a" (UID: "50119f5c-0d94-4c41-94bb-e335b6c8614a"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:16:02 crc kubenswrapper[4778]: I1205 16:16:02.875287 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50119f5c-0d94-4c41-94bb-e335b6c8614a-kube-api-access-8mvml" (OuterVolumeSpecName: "kube-api-access-8mvml") pod "50119f5c-0d94-4c41-94bb-e335b6c8614a" (UID: "50119f5c-0d94-4c41-94bb-e335b6c8614a"). InnerVolumeSpecName "kube-api-access-8mvml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:16:02 crc kubenswrapper[4778]: I1205 16:16:02.875528 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50119f5c-0d94-4c41-94bb-e335b6c8614a-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "50119f5c-0d94-4c41-94bb-e335b6c8614a" (UID: "50119f5c-0d94-4c41-94bb-e335b6c8614a"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:16:02 crc kubenswrapper[4778]: I1205 16:16:02.970949 4778 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50119f5c-0d94-4c41-94bb-e335b6c8614a-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:16:02 crc kubenswrapper[4778]: I1205 16:16:02.970998 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mvml\" (UniqueName: \"kubernetes.io/projected/50119f5c-0d94-4c41-94bb-e335b6c8614a-kube-api-access-8mvml\") on node \"crc\" DevicePath \"\"" Dec 05 16:16:02 crc kubenswrapper[4778]: I1205 16:16:02.971014 4778 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50119f5c-0d94-4c41-94bb-e335b6c8614a-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:16:03 crc kubenswrapper[4778]: I1205 16:16:03.531955 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" event={"ID":"50119f5c-0d94-4c41-94bb-e335b6c8614a","Type":"ContainerDied","Data":"e69e7e8a9b43e022d7ef092be09054349c7cd41e861b3c517728736a1db58ba3"} Dec 05 16:16:03 crc kubenswrapper[4778]: I1205 16:16:03.532006 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w" Dec 05 16:16:03 crc kubenswrapper[4778]: I1205 16:16:03.532010 4778 scope.go:117] "RemoveContainer" containerID="0d576638bb13d5cef960cf958e6ec389e1b9048b1b42747bb5263ff9bd3fb83d" Dec 05 16:16:03 crc kubenswrapper[4778]: I1205 16:16:03.558622 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w"] Dec 05 16:16:03 crc kubenswrapper[4778]: I1205 16:16:03.566511 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8558ff9844-wgz5w"] Dec 05 16:16:05 crc kubenswrapper[4778]: I1205 16:16:05.333650 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50119f5c-0d94-4c41-94bb-e335b6c8614a" path="/var/lib/kubelet/pods/50119f5c-0d94-4c41-94bb-e335b6c8614a/volumes" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.003047 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Dec 05 16:16:15 crc kubenswrapper[4778]: E1205 16:16:15.003950 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50119f5c-0d94-4c41-94bb-e335b6c8614a" containerName="manager" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.003970 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="50119f5c-0d94-4c41-94bb-e335b6c8614a" containerName="manager" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.004206 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="50119f5c-0d94-4c41-94bb-e335b6c8614a" containerName="manager" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.005068 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.008479 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-config-data" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.008685 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"kube-root-ca.crt" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.008816 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-server-dockercfg-w2vcc" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.012684 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-rabbitmq-svc" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.013502 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-server-conf" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.013885 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openshift-service-ca.crt" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.013910 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-default-user" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.019822 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-erlang-cookie" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.019820 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-plugins-conf" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.023519 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.045322 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/daf89267-199f-4532-b4f7-a74fc2ef5425-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.045646 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/daf89267-199f-4532-b4f7-a74fc2ef5425-pod-info\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.045767 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/daf89267-199f-4532-b4f7-a74fc2ef5425-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.045946 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhwpv\" (UniqueName: \"kubernetes.io/projected/daf89267-199f-4532-b4f7-a74fc2ef5425-kube-api-access-rhwpv\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.046086 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/daf89267-199f-4532-b4f7-a74fc2ef5425-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.046198 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/daf89267-199f-4532-b4f7-a74fc2ef5425-server-conf\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.046324 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/daf89267-199f-4532-b4f7-a74fc2ef5425-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.046459 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f68216b0-ecc6-40f2-ba58-242385965b86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f68216b0-ecc6-40f2-ba58-242385965b86\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.046558 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/daf89267-199f-4532-b4f7-a74fc2ef5425-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.046789 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/daf89267-199f-4532-b4f7-a74fc2ef5425-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.046872 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/daf89267-199f-4532-b4f7-a74fc2ef5425-config-data\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.147898 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/daf89267-199f-4532-b4f7-a74fc2ef5425-pod-info\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.147943 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/daf89267-199f-4532-b4f7-a74fc2ef5425-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 
16:16:15.147968 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhwpv\" (UniqueName: \"kubernetes.io/projected/daf89267-199f-4532-b4f7-a74fc2ef5425-kube-api-access-rhwpv\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.148002 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/daf89267-199f-4532-b4f7-a74fc2ef5425-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.148020 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/daf89267-199f-4532-b4f7-a74fc2ef5425-server-conf\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.148040 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/daf89267-199f-4532-b4f7-a74fc2ef5425-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.148061 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f68216b0-ecc6-40f2-ba58-242385965b86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f68216b0-ecc6-40f2-ba58-242385965b86\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.148081 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/daf89267-199f-4532-b4f7-a74fc2ef5425-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.148129 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/daf89267-199f-4532-b4f7-a74fc2ef5425-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.148151 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/daf89267-199f-4532-b4f7-a74fc2ef5425-config-data\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.148168 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/daf89267-199f-4532-b4f7-a74fc2ef5425-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.149393 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/daf89267-199f-4532-b4f7-a74fc2ef5425-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.149397 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/daf89267-199f-4532-b4f7-a74fc2ef5425-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.149828 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/daf89267-199f-4532-b4f7-a74fc2ef5425-config-data\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.150051 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/daf89267-199f-4532-b4f7-a74fc2ef5425-server-conf\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.150357 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/daf89267-199f-4532-b4f7-a74fc2ef5425-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.156426 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/daf89267-199f-4532-b4f7-a74fc2ef5425-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.158253 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/daf89267-199f-4532-b4f7-a74fc2ef5425-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.161541 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/daf89267-199f-4532-b4f7-a74fc2ef5425-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.162840 4778 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.162906 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f68216b0-ecc6-40f2-ba58-242385965b86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f68216b0-ecc6-40f2-ba58-242385965b86\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/08f3b3a259e90782e435e670a7f92356dc5a9a43f733fec4208a4d7304f0c1d7/globalmount\"" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.162858 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/daf89267-199f-4532-b4f7-a74fc2ef5425-pod-info\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.166822 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhwpv\" (UniqueName: \"kubernetes.io/projected/daf89267-199f-4532-b4f7-a74fc2ef5425-kube-api-access-rhwpv\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.192613 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f68216b0-ecc6-40f2-ba58-242385965b86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f68216b0-ecc6-40f2-ba58-242385965b86\") pod \"rabbitmq-server-0\" (UID: \"daf89267-199f-4532-b4f7-a74fc2ef5425\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.266535 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.268274 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.271136 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-server-dockercfg-v47nd" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.271437 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-default-user" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.271643 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-config-data" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.271805 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-server-conf" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.272479 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-rabbitmq-notifications-svc" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.272618 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-erlang-cookie" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.272637 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-plugins-conf" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.274929 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.321740 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.452728 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f73e1f56-f326-4886-9a0d-8f72407ebeb6-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.452794 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzpsh\" (UniqueName: \"kubernetes.io/projected/f73e1f56-f326-4886-9a0d-8f72407ebeb6-kube-api-access-vzpsh\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.452831 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f73e1f56-f326-4886-9a0d-8f72407ebeb6-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.452858 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f73e1f56-f326-4886-9a0d-8f72407ebeb6-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc 
kubenswrapper[4778]: I1205 16:16:15.452891 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f73e1f56-f326-4886-9a0d-8f72407ebeb6-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.452926 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-49259f39-2d2e-427b-8e81-63b2e306db33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49259f39-2d2e-427b-8e81-63b2e306db33\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.452949 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f73e1f56-f326-4886-9a0d-8f72407ebeb6-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.452977 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f73e1f56-f326-4886-9a0d-8f72407ebeb6-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.453004 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f73e1f56-f326-4886-9a0d-8f72407ebeb6-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.453026 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f73e1f56-f326-4886-9a0d-8f72407ebeb6-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.453052 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f73e1f56-f326-4886-9a0d-8f72407ebeb6-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.554199 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f73e1f56-f326-4886-9a0d-8f72407ebeb6-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.554573 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f73e1f56-f326-4886-9a0d-8f72407ebeb6-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.554596 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f73e1f56-f326-4886-9a0d-8f72407ebeb6-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.554623 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f73e1f56-f326-4886-9a0d-8f72407ebeb6-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.554687 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f73e1f56-f326-4886-9a0d-8f72407ebeb6-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.554716 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzpsh\" (UniqueName: \"kubernetes.io/projected/f73e1f56-f326-4886-9a0d-8f72407ebeb6-kube-api-access-vzpsh\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.554746 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f73e1f56-f326-4886-9a0d-8f72407ebeb6-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.554770 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f73e1f56-f326-4886-9a0d-8f72407ebeb6-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.554805 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f73e1f56-f326-4886-9a0d-8f72407ebeb6-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.554836 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-49259f39-2d2e-427b-8e81-63b2e306db33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49259f39-2d2e-427b-8e81-63b2e306db33\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " 
pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.554854 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f73e1f56-f326-4886-9a0d-8f72407ebeb6-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.556207 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f73e1f56-f326-4886-9a0d-8f72407ebeb6-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.562729 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f73e1f56-f326-4886-9a0d-8f72407ebeb6-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.575240 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f73e1f56-f326-4886-9a0d-8f72407ebeb6-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.576658 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f73e1f56-f326-4886-9a0d-8f72407ebeb6-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.577283 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f73e1f56-f326-4886-9a0d-8f72407ebeb6-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.583138 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f73e1f56-f326-4886-9a0d-8f72407ebeb6-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.586606 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f73e1f56-f326-4886-9a0d-8f72407ebeb6-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.600596 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f73e1f56-f326-4886-9a0d-8f72407ebeb6-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: 
\"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.646265 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f73e1f56-f326-4886-9a0d-8f72407ebeb6-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.676281 4778 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.676319 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-49259f39-2d2e-427b-8e81-63b2e306db33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49259f39-2d2e-427b-8e81-63b2e306db33\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b03c8d1c04b80fb0611c83e1de6861ae33a0de351a3a4859cd6145d03b8d70c3/globalmount\"" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.682328 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzpsh\" (UniqueName: \"kubernetes.io/projected/f73e1f56-f326-4886-9a0d-8f72407ebeb6-kube-api-access-vzpsh\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.726479 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-49259f39-2d2e-427b-8e81-63b2e306db33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49259f39-2d2e-427b-8e81-63b2e306db33\") pod \"rabbitmq-notifications-server-0\" (UID: \"f73e1f56-f326-4886-9a0d-8f72407ebeb6\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:15 crc kubenswrapper[4778]: I1205 16:16:15.931480 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.071225 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Dec 05 16:16:16 crc kubenswrapper[4778]: W1205 16:16:16.081506 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaf89267_199f_4532_b4f7_a74fc2ef5425.slice/crio-88371945e01c47a0f063940f2b9272e05efdc35c1b0e057268976da2d86b8989 WatchSource:0}: Error finding container 88371945e01c47a0f063940f2b9272e05efdc35c1b0e057268976da2d86b8989: Status 404 returned error can't find the container with id 88371945e01c47a0f063940f2b9272e05efdc35c1b0e057268976da2d86b8989 Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.427503 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.722830 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"daf89267-199f-4532-b4f7-a74fc2ef5425","Type":"ContainerStarted","Data":"88371945e01c47a0f063940f2b9272e05efdc35c1b0e057268976da2d86b8989"} Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.725283 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"f73e1f56-f326-4886-9a0d-8f72407ebeb6","Type":"ContainerStarted","Data":"9a902ed280d26f65bc12ae69d7ecf23e7df721ea12a468ba98eff64d906d88a8"} Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.803023 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.813614 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.816455 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-scripts" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.816661 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-config-data" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.820519 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.824858 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-galera-openstack-svc" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.828435 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"combined-ca-bundle" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.830843 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"galera-openstack-dockercfg-ntdtv" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.896501 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.898303 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.900203 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"memcached-config-data" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.906608 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"memcached-memcached-dockercfg-grfmt" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.906889 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-svc" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.907543 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e6c7d544-fb70-43d0-a77f-db48a7a63582-kolla-config\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.907652 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e6c7d544-fb70-43d0-a77f-db48a7a63582-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.907735 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e6c7d544-fb70-43d0-a77f-db48a7a63582-config-data-default\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.907791 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6c7d544-fb70-43d0-a77f-db48a7a63582-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.907841 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg8l6\" (UniqueName: \"kubernetes.io/projected/e6c7d544-fb70-43d0-a77f-db48a7a63582-kube-api-access-vg8l6\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.907867 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-48d18835-d508-4e05-9437-87c9c7766e1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48d18835-d508-4e05-9437-87c9c7766e1d\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.907904 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c7d544-fb70-43d0-a77f-db48a7a63582-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 
16:16:16.907963 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6c7d544-fb70-43d0-a77f-db48a7a63582-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:16 crc kubenswrapper[4778]: I1205 16:16:16.918059 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.011763 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6c7d544-fb70-43d0-a77f-db48a7a63582-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.011835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e6c7d544-fb70-43d0-a77f-db48a7a63582-kolla-config\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.011884 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9hdk\" (UniqueName: \"kubernetes.io/projected/859e60ea-04a4-49b6-8d50-6268c41f8131-kube-api-access-h9hdk\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.011904 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e6c7d544-fb70-43d0-a77f-db48a7a63582-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.011927 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e6c7d544-fb70-43d0-a77f-db48a7a63582-config-data-default\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.011953 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6c7d544-fb70-43d0-a77f-db48a7a63582-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.011974 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/859e60ea-04a4-49b6-8d50-6268c41f8131-memcached-tls-certs\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.012002 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg8l6\" (UniqueName: \"kubernetes.io/projected/e6c7d544-fb70-43d0-a77f-db48a7a63582-kube-api-access-vg8l6\") pod \"openstack-galera-0\" (UID: 
\"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.012018 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/859e60ea-04a4-49b6-8d50-6268c41f8131-kolla-config\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.012042 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-48d18835-d508-4e05-9437-87c9c7766e1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48d18835-d508-4e05-9437-87c9c7766e1d\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.012061 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/859e60ea-04a4-49b6-8d50-6268c41f8131-config-data\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.012085 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c7d544-fb70-43d0-a77f-db48a7a63582-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.012116 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/859e60ea-04a4-49b6-8d50-6268c41f8131-combined-ca-bundle\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.013620 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e6c7d544-fb70-43d0-a77f-db48a7a63582-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.014318 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e6c7d544-fb70-43d0-a77f-db48a7a63582-kolla-config\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.014341 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e6c7d544-fb70-43d0-a77f-db48a7a63582-config-data-default\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.014617 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6c7d544-fb70-43d0-a77f-db48a7a63582-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " 
pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.026329 4778 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.026359 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-48d18835-d508-4e05-9437-87c9c7766e1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48d18835-d508-4e05-9437-87c9c7766e1d\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/278c33fc0e5290579ab7f92078b84f4aa4fc0eb9b0c772ad549378bf5fc8ee96/globalmount\"" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.043395 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg8l6\" (UniqueName: \"kubernetes.io/projected/e6c7d544-fb70-43d0-a77f-db48a7a63582-kube-api-access-vg8l6\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.048844 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c7d544-fb70-43d0-a77f-db48a7a63582-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.056052 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6c7d544-fb70-43d0-a77f-db48a7a63582-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.083883 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-48d18835-d508-4e05-9437-87c9c7766e1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48d18835-d508-4e05-9437-87c9c7766e1d\") pod \"openstack-galera-0\" (UID: \"e6c7d544-fb70-43d0-a77f-db48a7a63582\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.113223 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9hdk\" (UniqueName: \"kubernetes.io/projected/859e60ea-04a4-49b6-8d50-6268c41f8131-kube-api-access-h9hdk\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.113283 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/859e60ea-04a4-49b6-8d50-6268c41f8131-memcached-tls-certs\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.113307 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/859e60ea-04a4-49b6-8d50-6268c41f8131-kolla-config\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 
16:16:17.113341 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/859e60ea-04a4-49b6-8d50-6268c41f8131-config-data\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.113390 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/859e60ea-04a4-49b6-8d50-6268c41f8131-combined-ca-bundle\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.114226 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/859e60ea-04a4-49b6-8d50-6268c41f8131-kolla-config\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.114289 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/859e60ea-04a4-49b6-8d50-6268c41f8131-config-data\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.117748 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/859e60ea-04a4-49b6-8d50-6268c41f8131-memcached-tls-certs\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.131145 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9hdk\" (UniqueName: \"kubernetes.io/projected/859e60ea-04a4-49b6-8d50-6268c41f8131-kube-api-access-h9hdk\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.136773 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/859e60ea-04a4-49b6-8d50-6268c41f8131-combined-ca-bundle\") pod \"memcached-0\" (UID: \"859e60ea-04a4-49b6-8d50-6268c41f8131\") " pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.149531 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.215385 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.215389 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.216751 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.228168 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"telemetry-ceilometer-dockercfg-rphr2" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.317809 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw9hg\" (UniqueName: \"kubernetes.io/projected/01b4e1d6-1a7d-4113-8954-278cfe2d60c3-kube-api-access-hw9hg\") pod \"kube-state-metrics-0\" (UID: \"01b4e1d6-1a7d-4113-8954-278cfe2d60c3\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.317871 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.419749 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw9hg\" (UniqueName: \"kubernetes.io/projected/01b4e1d6-1a7d-4113-8954-278cfe2d60c3-kube-api-access-hw9hg\") pod \"kube-state-metrics-0\" (UID: \"01b4e1d6-1a7d-4113-8954-278cfe2d60c3\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.459202 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw9hg\" (UniqueName: \"kubernetes.io/projected/01b4e1d6-1a7d-4113-8954-278cfe2d60c3-kube-api-access-hw9hg\") pod \"kube-state-metrics-0\" (UID: \"01b4e1d6-1a7d-4113-8954-278cfe2d60c3\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:16:17 crc kubenswrapper[4778]: I1205 16:16:17.669408 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.034090 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.036727 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.044024 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-generated" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.044098 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-tls-assets-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.044290 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-cluster-tls-config" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.044397 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-web-config" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.044652 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-alertmanager-dockercfg-6d8q6" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.076695 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.107496 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.235911 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/02c25bd5-13b2-447c-ba37-5aadb8a33da0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.236288 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/02c25bd5-13b2-447c-ba37-5aadb8a33da0-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.236385 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/02c25bd5-13b2-447c-ba37-5aadb8a33da0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.236429 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/02c25bd5-13b2-447c-ba37-5aadb8a33da0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.236564 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/02c25bd5-13b2-447c-ba37-5aadb8a33da0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 
crc kubenswrapper[4778]: I1205 16:16:18.236635 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tpkd\" (UniqueName: \"kubernetes.io/projected/02c25bd5-13b2-447c-ba37-5aadb8a33da0-kube-api-access-4tpkd\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.236711 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/02c25bd5-13b2-447c-ba37-5aadb8a33da0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.248884 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 05 16:16:18 crc kubenswrapper[4778]: W1205 16:16:18.262556 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod859e60ea_04a4_49b6_8d50_6268c41f8131.slice/crio-62ea0cd85c7793bb56960648cc8249a80cef48c01c4b07e720f190ce423b0586 WatchSource:0}: Error finding container 62ea0cd85c7793bb56960648cc8249a80cef48c01c4b07e720f190ce423b0586: Status 404 returned error can't find the container with id 62ea0cd85c7793bb56960648cc8249a80cef48c01c4b07e720f190ce423b0586 Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.338841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/02c25bd5-13b2-447c-ba37-5aadb8a33da0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.339049 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/02c25bd5-13b2-447c-ba37-5aadb8a33da0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.339078 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/02c25bd5-13b2-447c-ba37-5aadb8a33da0-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.339174 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/02c25bd5-13b2-447c-ba37-5aadb8a33da0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.339209 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/02c25bd5-13b2-447c-ba37-5aadb8a33da0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " 
pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.339279 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/02c25bd5-13b2-447c-ba37-5aadb8a33da0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.339376 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tpkd\" (UniqueName: \"kubernetes.io/projected/02c25bd5-13b2-447c-ba37-5aadb8a33da0-kube-api-access-4tpkd\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.341130 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/02c25bd5-13b2-447c-ba37-5aadb8a33da0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.351214 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/02c25bd5-13b2-447c-ba37-5aadb8a33da0-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.351757 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/02c25bd5-13b2-447c-ba37-5aadb8a33da0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.353772 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/02c25bd5-13b2-447c-ba37-5aadb8a33da0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.357547 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tpkd\" (UniqueName: \"kubernetes.io/projected/02c25bd5-13b2-447c-ba37-5aadb8a33da0-kube-api-access-4tpkd\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.359548 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/02c25bd5-13b2-447c-ba37-5aadb8a33da0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.367462 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/02c25bd5-13b2-447c-ba37-5aadb8a33da0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: 
\"02c25bd5-13b2-447c-ba37-5aadb8a33da0\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.398569 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.442785 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb"] Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.452537 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.460172 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.460810 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-79g5x" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.474939 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb"] Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.546310 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw88q\" (UniqueName: \"kubernetes.io/projected/66d01458-bc30-463f-b7e3-6e20bb4ea267-kube-api-access-bw88q\") pod \"observability-ui-dashboards-7d5fb4cbfb-lk9wb\" (UID: \"66d01458-bc30-463f-b7e3-6e20bb4ea267\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.546379 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66d01458-bc30-463f-b7e3-6e20bb4ea267-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-lk9wb\" (UID: \"66d01458-bc30-463f-b7e3-6e20bb4ea267\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.649461 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw88q\" (UniqueName: \"kubernetes.io/projected/66d01458-bc30-463f-b7e3-6e20bb4ea267-kube-api-access-bw88q\") pod \"observability-ui-dashboards-7d5fb4cbfb-lk9wb\" (UID: \"66d01458-bc30-463f-b7e3-6e20bb4ea267\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.649506 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66d01458-bc30-463f-b7e3-6e20bb4ea267-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-lk9wb\" (UID: \"66d01458-bc30-463f-b7e3-6e20bb4ea267\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb" Dec 05 16:16:18 crc kubenswrapper[4778]: E1205 16:16:18.649652 4778 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Dec 05 16:16:18 crc kubenswrapper[4778]: E1205 16:16:18.649713 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d01458-bc30-463f-b7e3-6e20bb4ea267-serving-cert podName:66d01458-bc30-463f-b7e3-6e20bb4ea267 nodeName:}" failed. 
No retries permitted until 2025-12-05 16:16:19.149696191 +0000 UTC m=+1266.253492571 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/66d01458-bc30-463f-b7e3-6e20bb4ea267-serving-cert") pod "observability-ui-dashboards-7d5fb4cbfb-lk9wb" (UID: "66d01458-bc30-463f-b7e3-6e20bb4ea267") : secret "observability-ui-dashboards" not found Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.664990 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.673306 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw88q\" (UniqueName: \"kubernetes.io/projected/66d01458-bc30-463f-b7e3-6e20bb4ea267-kube-api-access-bw88q\") pod \"observability-ui-dashboards-7d5fb4cbfb-lk9wb\" (UID: \"66d01458-bc30-463f-b7e3-6e20bb4ea267\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.688155 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.694112 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.697908 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.698203 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.698768 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-tls-assets-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.699014 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.699121 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-web-config" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.699191 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-prometheus-dockercfg-m9tj2" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.712350 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.792995 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"e6c7d544-fb70-43d0-a77f-db48a7a63582","Type":"ContainerStarted","Data":"65e083bda48d85d7090e9e0a31b207b1c50226f220168d87aaa36116d93b923a"} Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.810530 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-865ffdf9b9-69jc8"] Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.821703 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" 
event={"ID":"01b4e1d6-1a7d-4113-8954-278cfe2d60c3","Type":"ContainerStarted","Data":"fe7d71a7226f273fb87fe7bdfd3890b738940c36456ded2b9115c90710544894"} Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.821745 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"859e60ea-04a4-49b6-8d50-6268c41f8131","Type":"ContainerStarted","Data":"62ea0cd85c7793bb56960648cc8249a80cef48c01c4b07e720f190ce423b0586"} Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.821817 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-865ffdf9b9-69jc8" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.837271 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-865ffdf9b9-69jc8"] Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.853185 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/966c547a-2be8-4f17-8b73-6a6904e6d6ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.853257 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.853292 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.853317 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-config\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.853331 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnhtg\" (UniqueName: \"kubernetes.io/projected/966c547a-2be8-4f17-8b73-6a6904e6d6ef-kube-api-access-bnhtg\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.853356 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/966c547a-2be8-4f17-8b73-6a6904e6d6ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.853392 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.853415 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/966c547a-2be8-4f17-8b73-6a6904e6d6ef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.954951 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/966c547a-2be8-4f17-8b73-6a6904e6d6ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955017 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-service-ca\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955037 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bg4j\" (UniqueName: \"kubernetes.io/projected/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-kube-api-access-9bg4j\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955072 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955092 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-oauth-serving-cert\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955122 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955158 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-config\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 
16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955179 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnhtg\" (UniqueName: \"kubernetes.io/projected/966c547a-2be8-4f17-8b73-6a6904e6d6ef-kube-api-access-bnhtg\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955207 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/966c547a-2be8-4f17-8b73-6a6904e6d6ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955234 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-console-config\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955257 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955277 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-console-serving-cert\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955293 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-console-oauth-config\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955311 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/966c547a-2be8-4f17-8b73-6a6904e6d6ef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.955332 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-trusted-ca-bundle\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.960257 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.960618 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/966c547a-2be8-4f17-8b73-6a6904e6d6ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.961029 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/966c547a-2be8-4f17-8b73-6a6904e6d6ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.963983 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-config\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.967164 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.967330 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/966c547a-2be8-4f17-8b73-6a6904e6d6ef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.991727 4778 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 16:16:18 crc kubenswrapper[4778]: I1205 16:16:18.991793 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cb4b5983a897e7197d99054548cb4b72c2c3c3eece65a6205a52532f13a56352/globalmount\"" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.004259 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnhtg\" (UniqueName: \"kubernetes.io/projected/966c547a-2be8-4f17-8b73-6a6904e6d6ef-kube-api-access-bnhtg\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.047952 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\") pod \"prometheus-metric-storage-0\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.056344 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-console-config\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.056411 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-console-serving-cert\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.056429 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-console-oauth-config\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.056488 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-trusted-ca-bundle\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.056567 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-service-ca\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.056584 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bg4j\" (UniqueName: \"kubernetes.io/projected/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-kube-api-access-9bg4j\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.056637 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-oauth-serving-cert\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.059152 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-service-ca\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.059877 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-trusted-ca-bundle\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.060739 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-oauth-serving-cert\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.062927 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-console-config\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.065279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-console-oauth-config\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.067343 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-console-serving-cert\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.074171 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bg4j\" (UniqueName: \"kubernetes.io/projected/a94b1f3e-2fe1-4efd-b77e-56de66d4ea69-kube-api-access-9bg4j\") pod \"console-865ffdf9b9-69jc8\" (UID: \"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69\") " pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.157974 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.167497 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66d01458-bc30-463f-b7e3-6e20bb4ea267-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-lk9wb\" (UID: \"66d01458-bc30-463f-b7e3-6e20bb4ea267\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.171635 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66d01458-bc30-463f-b7e3-6e20bb4ea267-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-lk9wb\" (UID: \"66d01458-bc30-463f-b7e3-6e20bb4ea267\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.271482 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"]
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.324318 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.429753 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb"
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.698781 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-865ffdf9b9-69jc8"]
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.825551 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"02c25bd5-13b2-447c-ba37-5aadb8a33da0","Type":"ContainerStarted","Data":"027782bb45a636bd0822e4c302032a305f4bb1274e2cd1de11423405def3e379"}
Dec 05 16:16:19 crc kubenswrapper[4778]: I1205 16:16:19.826826 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-865ffdf9b9-69jc8" event={"ID":"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69","Type":"ContainerStarted","Data":"9dec539a21cb1fdef0e0c27570ae858a9a0f3a189608cd5561cf5a612efad55d"}
Dec 05 16:16:20 crc kubenswrapper[4778]: I1205 16:16:20.006937 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 05 16:16:20 crc kubenswrapper[4778]: I1205 16:16:20.089504 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb"]
Dec 05 16:16:20 crc kubenswrapper[4778]: W1205 16:16:20.144202 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod966c547a_2be8_4f17_8b73_6a6904e6d6ef.slice/crio-43ad9fb80a7ea7a0e9ce6e2ff28dfc014305e99f0edb2ab7fd1420e553825660 WatchSource:0}: Error finding container 43ad9fb80a7ea7a0e9ce6e2ff28dfc014305e99f0edb2ab7fd1420e553825660: Status 404 returned error can't find the container with id 43ad9fb80a7ea7a0e9ce6e2ff28dfc014305e99f0edb2ab7fd1420e553825660
Dec 05 16:16:20 crc kubenswrapper[4778]: W1205 16:16:20.161322 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66d01458_bc30_463f_b7e3_6e20bb4ea267.slice/crio-6fb7d7d792e5766537afbd1969331bb0dea02171b6097f07a6e73a3eba480693 WatchSource:0}: Error finding container 6fb7d7d792e5766537afbd1969331bb0dea02171b6097f07a6e73a3eba480693: Status 404 returned error can't find the container with id 6fb7d7d792e5766537afbd1969331bb0dea02171b6097f07a6e73a3eba480693
Dec 05 16:16:20 crc kubenswrapper[4778]: I1205 16:16:20.838833 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"966c547a-2be8-4f17-8b73-6a6904e6d6ef","Type":"ContainerStarted","Data":"43ad9fb80a7ea7a0e9ce6e2ff28dfc014305e99f0edb2ab7fd1420e553825660"}
Dec 05 16:16:20 crc kubenswrapper[4778]: I1205 16:16:20.841566 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-865ffdf9b9-69jc8" event={"ID":"a94b1f3e-2fe1-4efd-b77e-56de66d4ea69","Type":"ContainerStarted","Data":"d0b3f689cb6a9656046ecc1485d8c8c3591a0757ed120e358a14f08fe81a53ba"}
Dec 05 16:16:20 crc kubenswrapper[4778]: I1205 16:16:20.843674 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb" event={"ID":"66d01458-bc30-463f-b7e3-6e20bb4ea267","Type":"ContainerStarted","Data":"6fb7d7d792e5766537afbd1969331bb0dea02171b6097f07a6e73a3eba480693"}
Dec 05 16:16:20 crc kubenswrapper[4778]: I1205 16:16:20.864814 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-865ffdf9b9-69jc8" podStartSLOduration=2.864787272 podStartE2EDuration="2.864787272s" podCreationTimestamp="2025-12-05 16:16:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:16:20.859248575 +0000 UTC m=+1267.963044955" watchObservedRunningTime="2025-12-05 16:16:20.864787272 +0000 UTC m=+1267.968583642"
Dec 05 16:16:29 crc kubenswrapper[4778]: I1205 16:16:29.159192 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:29 crc kubenswrapper[4778]: I1205 16:16:29.159722 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:29 crc kubenswrapper[4778]: I1205 16:16:29.163181 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:29 crc kubenswrapper[4778]: I1205 16:16:29.926622 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-865ffdf9b9-69jc8"
Dec 05 16:16:30 crc kubenswrapper[4778]: I1205 16:16:30.002742 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78776b8fbf-hfsq2"]
Dec 05 16:16:33 crc kubenswrapper[4778]: I1205 16:16:33.414746 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 16:16:33 crc kubenswrapper[4778]: I1205 16:16:33.414841 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 16:16:33 crc kubenswrapper[4778]: E1205 16:16:33.551426 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified"
Dec 05 16:16:33 crc kubenswrapper[4778]: E1205 16:16:33.551882 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rhwpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_watcher-kuttl-default(daf89267-199f-4532-b4f7-a74fc2ef5425): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 16:16:33 crc kubenswrapper[4778]: E1205 16:16:33.553119 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/rabbitmq-server-0" podUID="daf89267-199f-4532-b4f7-a74fc2ef5425"
Dec 05 16:16:33 crc kubenswrapper[4778]: E1205 16:16:33.955498 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="watcher-kuttl-default/rabbitmq-server-0" podUID="daf89267-199f-4532-b4f7-a74fc2ef5425"
Dec 05 16:16:34 crc kubenswrapper[4778]: E1205 16:16:34.414198 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified"
Dec 05 16:16:34 crc kubenswrapper[4778]: E1205 16:16:34.414422 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:nc9h5d6h5b4h64h599h5c8hfbh699h54ch5d8h55dh8fh89h5ffh5f4hd4h675h55fh67dh6fhffh685h9ch54ch58fhf8h86hbbh695h674h66fhdbq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h9hdk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_watcher-kuttl-default(859e60ea-04a4-49b6-8d50-6268c41f8131): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 16:16:34 crc kubenswrapper[4778]: E1205 16:16:34.415795 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/memcached-0" podUID="859e60ea-04a4-49b6-8d50-6268c41f8131"
Dec 05 16:16:34 crc kubenswrapper[4778]: E1205 16:16:34.975059 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="watcher-kuttl-default/memcached-0" podUID="859e60ea-04a4-49b6-8d50-6268c41f8131"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.184121 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.184596 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vg8l6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_watcher-kuttl-default(e6c7d544-fb70-43d0-a77f-db48a7a63582): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.185948 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/openstack-galera-0" podUID="e6c7d544-fb70-43d0-a77f-db48a7a63582"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.713490 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.713546 4778 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.713684 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=watcher-kuttl-default],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hw9hg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_watcher-kuttl-default(01b4e1d6-1a7d-4113-8954-278cfe2d60c3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.715355 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="01b4e1d6-1a7d-4113-8954-278cfe2d60c3"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.750297 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.750472 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vzpsh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-notifications-server-0_watcher-kuttl-default(f73e1f56-f326-4886-9a0d-8f72407ebeb6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.752455 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" podUID="f73e1f56-f326-4886-9a0d-8f72407ebeb6"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.978239 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" podUID="f73e1f56-f326-4886-9a0d-8f72407ebeb6"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.978239 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="01b4e1d6-1a7d-4113-8954-278cfe2d60c3"
Dec 05 16:16:36 crc kubenswrapper[4778]: E1205 16:16:36.978316 4778 pod_workers.go:1301] "Error syncing pod, skipping"
err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="watcher-kuttl-default/openstack-galera-0" podUID="e6c7d544-fb70-43d0-a77f-db48a7a63582" Dec 05 16:16:37 crc kubenswrapper[4778]: I1205 16:16:37.983382 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb" event={"ID":"66d01458-bc30-463f-b7e3-6e20bb4ea267","Type":"ContainerStarted","Data":"5d74001ab63a45576b09ad2ad64e6b4dad5d8f84956d78edc6055382504c2784"} Dec 05 16:16:38 crc kubenswrapper[4778]: I1205 16:16:38.000010 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-lk9wb" podStartSLOduration=4.01115981 podStartE2EDuration="19.99999134s" podCreationTimestamp="2025-12-05 16:16:18 +0000 UTC" firstStartedPulling="2025-12-05 16:16:20.163563175 +0000 UTC m=+1267.267359555" lastFinishedPulling="2025-12-05 16:16:36.152394705 +0000 UTC m=+1283.256191085" observedRunningTime="2025-12-05 16:16:37.997768492 +0000 UTC m=+1285.101564872" watchObservedRunningTime="2025-12-05 16:16:37.99999134 +0000 UTC m=+1285.103787720" Dec 05 16:16:39 crc kubenswrapper[4778]: I1205 16:16:39.998549 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"02c25bd5-13b2-447c-ba37-5aadb8a33da0","Type":"ContainerStarted","Data":"db45dba2785a1a01f0eb50234ca1ccc9d961a9ab14a41f7a4170318862ad531c"} Dec 05 16:16:40 crc kubenswrapper[4778]: I1205 16:16:40.000749 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"966c547a-2be8-4f17-8b73-6a6904e6d6ef","Type":"ContainerStarted","Data":"27c4f772336a6d6f5d0b4e945f22dbbfc1e55d42f9bbd8dcdac95f41db86583d"} Dec 05 16:16:48 crc kubenswrapper[4778]: I1205 16:16:48.068283 4778 generic.go:334] "Generic (PLEG): container finished" podID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerID="27c4f772336a6d6f5d0b4e945f22dbbfc1e55d42f9bbd8dcdac95f41db86583d" exitCode=0 Dec 05 16:16:48 crc kubenswrapper[4778]: I1205 16:16:48.068504 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"966c547a-2be8-4f17-8b73-6a6904e6d6ef","Type":"ContainerDied","Data":"27c4f772336a6d6f5d0b4e945f22dbbfc1e55d42f9bbd8dcdac95f41db86583d"} Dec 05 16:16:48 crc kubenswrapper[4778]: I1205 16:16:48.071749 4778 generic.go:334] "Generic (PLEG): container finished" podID="02c25bd5-13b2-447c-ba37-5aadb8a33da0" containerID="db45dba2785a1a01f0eb50234ca1ccc9d961a9ab14a41f7a4170318862ad531c" exitCode=0 Dec 05 16:16:48 crc kubenswrapper[4778]: I1205 16:16:48.071792 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"02c25bd5-13b2-447c-ba37-5aadb8a33da0","Type":"ContainerDied","Data":"db45dba2785a1a01f0eb50234ca1ccc9d961a9ab14a41f7a4170318862ad531c"} Dec 05 16:16:49 crc kubenswrapper[4778]: I1205 16:16:49.082295 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"daf89267-199f-4532-b4f7-a74fc2ef5425","Type":"ContainerStarted","Data":"e6f332fb9d6fdd90b3000588be004e4bac9f2726275eeadcccede6cff5e62788"} Dec 05 16:16:49 crc kubenswrapper[4778]: I1205 16:16:49.085343 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"e6c7d544-fb70-43d0-a77f-db48a7a63582","Type":"ContainerStarted","Data":"4a9d22e045b3d2aefab204f7f4e4af8827cab8265d46a2de70cb16c93ba62932"} Dec 05 16:16:50 crc kubenswrapper[4778]: I1205 16:16:50.093566 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"859e60ea-04a4-49b6-8d50-6268c41f8131","Type":"ContainerStarted","Data":"eeafa5a168621736a0641cc70776068d218b4f8aa799d40bb11302fdfe84b9b5"} Dec 05 16:16:50 crc kubenswrapper[4778]: I1205 16:16:50.094091 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:50 crc kubenswrapper[4778]: I1205 16:16:50.115724 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/memcached-0" podStartSLOduration=2.6866392489999997 podStartE2EDuration="34.115708043s" podCreationTimestamp="2025-12-05 16:16:16 +0000 UTC" firstStartedPulling="2025-12-05 16:16:18.271632885 +0000 UTC m=+1265.375429265" lastFinishedPulling="2025-12-05 16:16:49.700701679 +0000 UTC m=+1296.804498059" observedRunningTime="2025-12-05 16:16:50.111647216 +0000 UTC m=+1297.215443636" watchObservedRunningTime="2025-12-05 16:16:50.115708043 +0000 UTC m=+1297.219504423" Dec 05 16:16:51 crc kubenswrapper[4778]: I1205 16:16:51.103941 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"01b4e1d6-1a7d-4113-8954-278cfe2d60c3","Type":"ContainerStarted","Data":"9f9cc9f620d6ce4a31aae4d3bb7a095d64b1b1333ce9f316acde898ebd684071"} Dec 05 16:16:51 crc kubenswrapper[4778]: I1205 16:16:51.104429 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:16:51 crc kubenswrapper[4778]: I1205 16:16:51.106688 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"02c25bd5-13b2-447c-ba37-5aadb8a33da0","Type":"ContainerStarted","Data":"0e12c40f78017fd0cd271ae4f8ec489decdf9f9acd9f33449c653ee0ec5bcd18"} Dec 05 16:16:51 crc kubenswrapper[4778]: I1205 16:16:51.127780 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/kube-state-metrics-0" podStartSLOduration=1.932628966 podStartE2EDuration="34.127762385s" podCreationTimestamp="2025-12-05 16:16:17 +0000 UTC" firstStartedPulling="2025-12-05 16:16:18.463297302 +0000 UTC m=+1265.567093682" lastFinishedPulling="2025-12-05 16:16:50.658430721 +0000 UTC m=+1297.762227101" observedRunningTime="2025-12-05 16:16:51.122433653 +0000 UTC m=+1298.226230043" watchObservedRunningTime="2025-12-05 16:16:51.127762385 +0000 UTC m=+1298.231558765" Dec 05 16:16:55 crc kubenswrapper[4778]: I1205 16:16:55.066313 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-78776b8fbf-hfsq2" podUID="1f5e616a-22c3-400f-829a-21846578a9d0" containerName="console" containerID="cri-o://39df04f83ed5475f698c3ffc8cf4840b4f8831615c8b25b6129b11eeadc26769" gracePeriod=15 Dec 05 16:16:55 crc kubenswrapper[4778]: I1205 16:16:55.370395 4778 patch_prober.go:28] interesting pod/console-78776b8fbf-hfsq2 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/health\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Dec 05 16:16:55 crc kubenswrapper[4778]: I1205 16:16:55.370454 4778 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/console-78776b8fbf-hfsq2" podUID="1f5e616a-22c3-400f-829a-21846578a9d0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.49:8443/health\": dial tcp 10.217.0.49:8443: connect: connection refused" Dec 05 16:16:56 crc kubenswrapper[4778]: I1205 16:16:56.149315 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78776b8fbf-hfsq2_1f5e616a-22c3-400f-829a-21846578a9d0/console/0.log" Dec 05 16:16:56 crc kubenswrapper[4778]: I1205 16:16:56.149667 4778 generic.go:334] "Generic (PLEG): container finished" podID="1f5e616a-22c3-400f-829a-21846578a9d0" containerID="39df04f83ed5475f698c3ffc8cf4840b4f8831615c8b25b6129b11eeadc26769" exitCode=2 Dec 05 16:16:56 crc kubenswrapper[4778]: I1205 16:16:56.149697 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78776b8fbf-hfsq2" event={"ID":"1f5e616a-22c3-400f-829a-21846578a9d0","Type":"ContainerDied","Data":"39df04f83ed5475f698c3ffc8cf4840b4f8831615c8b25b6129b11eeadc26769"} Dec 05 16:16:57 crc kubenswrapper[4778]: I1205 16:16:57.158437 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"f73e1f56-f326-4886-9a0d-8f72407ebeb6","Type":"ContainerStarted","Data":"23d2decd8cfc30b3efb75bc47a6044b70036e730721a95e844c61683dce67f7a"} Dec 05 16:16:57 crc kubenswrapper[4778]: I1205 16:16:57.160173 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"02c25bd5-13b2-447c-ba37-5aadb8a33da0","Type":"ContainerStarted","Data":"6c272fceabda6acaf72656a7f2b78b0edfbf3a8847936b3f6d19c546fe765b4f"} Dec 05 16:16:57 crc kubenswrapper[4778]: I1205 16:16:57.216322 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/memcached-0" Dec 05 16:16:57 crc kubenswrapper[4778]: I1205 16:16:57.674046 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:16:58 crc kubenswrapper[4778]: I1205 16:16:58.171296 4778 generic.go:334] "Generic (PLEG): container finished" podID="e6c7d544-fb70-43d0-a77f-db48a7a63582" containerID="4a9d22e045b3d2aefab204f7f4e4af8827cab8265d46a2de70cb16c93ba62932" exitCode=0 Dec 05 16:16:58 crc kubenswrapper[4778]: I1205 16:16:58.172930 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"e6c7d544-fb70-43d0-a77f-db48a7a63582","Type":"ContainerDied","Data":"4a9d22e045b3d2aefab204f7f4e4af8827cab8265d46a2de70cb16c93ba62932"} Dec 05 16:16:58 crc kubenswrapper[4778]: I1205 16:16:58.173238 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:58 crc kubenswrapper[4778]: I1205 16:16:58.177390 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 16:16:58 crc kubenswrapper[4778]: I1205 16:16:58.201937 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/alertmanager-metric-storage-0" podStartSLOduration=9.074768194 podStartE2EDuration="40.201919089s" podCreationTimestamp="2025-12-05 16:16:18 +0000 UTC" firstStartedPulling="2025-12-05 16:16:19.307753724 +0000 UTC m=+1266.411550104" lastFinishedPulling="2025-12-05 16:16:50.434904619 +0000 UTC m=+1297.538700999" observedRunningTime="2025-12-05 
16:16:58.196907676 +0000 UTC m=+1305.300704086" watchObservedRunningTime="2025-12-05 16:16:58.201919089 +0000 UTC m=+1305.305715469" Dec 05 16:17:02 crc kubenswrapper[4778]: E1205 16:17:02.457531 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b" Dec 05 16:17:02 crc kubenswrapper[4778]: E1205 16:17:02.458629 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus,Image:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,Command:[],Args:[--config.file=/etc/prometheus/config_out/prometheus.env.yaml --web.enable-lifecycle --web.route-prefix=/ --storage.tsdb.retention.time=24h --storage.tsdb.path=/prometheus --web.config.file=/etc/prometheus/web_config/web-config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-out,ReadOnly:true,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-assets,ReadOnly:true,MountPath:/etc/prometheus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-db,ReadOnly:false,MountPath:/prometheus,SubPath:prometheus-db,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/prometheus/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bnhtg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/healthy,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 
web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_watcher-kuttl-default(966c547a-2be8-4f17-8b73-6a6904e6d6ef): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.560641 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78776b8fbf-hfsq2_1f5e616a-22c3-400f-829a-21846578a9d0/console/0.log" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.560721 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.722808 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f5e616a-22c3-400f-829a-21846578a9d0-console-serving-cert\") pod \"1f5e616a-22c3-400f-829a-21846578a9d0\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.723210 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-oauth-serving-cert\") pod \"1f5e616a-22c3-400f-829a-21846578a9d0\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.723239 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-service-ca\") pod \"1f5e616a-22c3-400f-829a-21846578a9d0\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.723265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-console-config\") pod \"1f5e616a-22c3-400f-829a-21846578a9d0\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.723360 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f5e616a-22c3-400f-829a-21846578a9d0-console-oauth-config\") pod \"1f5e616a-22c3-400f-829a-21846578a9d0\" (UID: 
\"1f5e616a-22c3-400f-829a-21846578a9d0\") " Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.723453 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khhr7\" (UniqueName: \"kubernetes.io/projected/1f5e616a-22c3-400f-829a-21846578a9d0-kube-api-access-khhr7\") pod \"1f5e616a-22c3-400f-829a-21846578a9d0\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.723499 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-trusted-ca-bundle\") pod \"1f5e616a-22c3-400f-829a-21846578a9d0\" (UID: \"1f5e616a-22c3-400f-829a-21846578a9d0\") " Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.724682 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1f5e616a-22c3-400f-829a-21846578a9d0" (UID: "1f5e616a-22c3-400f-829a-21846578a9d0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.725187 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-console-config" (OuterVolumeSpecName: "console-config") pod "1f5e616a-22c3-400f-829a-21846578a9d0" (UID: "1f5e616a-22c3-400f-829a-21846578a9d0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.725380 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-service-ca" (OuterVolumeSpecName: "service-ca") pod "1f5e616a-22c3-400f-829a-21846578a9d0" (UID: "1f5e616a-22c3-400f-829a-21846578a9d0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.725504 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1f5e616a-22c3-400f-829a-21846578a9d0" (UID: "1f5e616a-22c3-400f-829a-21846578a9d0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.727026 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f5e616a-22c3-400f-829a-21846578a9d0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1f5e616a-22c3-400f-829a-21846578a9d0" (UID: "1f5e616a-22c3-400f-829a-21846578a9d0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.727963 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f5e616a-22c3-400f-829a-21846578a9d0-kube-api-access-khhr7" (OuterVolumeSpecName: "kube-api-access-khhr7") pod "1f5e616a-22c3-400f-829a-21846578a9d0" (UID: "1f5e616a-22c3-400f-829a-21846578a9d0"). InnerVolumeSpecName "kube-api-access-khhr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.728601 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f5e616a-22c3-400f-829a-21846578a9d0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1f5e616a-22c3-400f-829a-21846578a9d0" (UID: "1f5e616a-22c3-400f-829a-21846578a9d0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.825859 4778 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f5e616a-22c3-400f-829a-21846578a9d0-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.825918 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khhr7\" (UniqueName: \"kubernetes.io/projected/1f5e616a-22c3-400f-829a-21846578a9d0-kube-api-access-khhr7\") on node \"crc\" DevicePath \"\"" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.825934 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.825952 4778 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f5e616a-22c3-400f-829a-21846578a9d0-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.825970 4778 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.825987 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:17:02 crc kubenswrapper[4778]: I1205 16:17:02.826002 4778 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f5e616a-22c3-400f-829a-21846578a9d0-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:17:03 crc kubenswrapper[4778]: I1205 16:17:03.210580 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"e6c7d544-fb70-43d0-a77f-db48a7a63582","Type":"ContainerStarted","Data":"9de88b37598dfdcafd4113b6b743c39561980e47e81ecb8b5504fe89c99c9eea"} Dec 05 16:17:03 crc kubenswrapper[4778]: I1205 16:17:03.212763 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78776b8fbf-hfsq2_1f5e616a-22c3-400f-829a-21846578a9d0/console/0.log" Dec 05 16:17:03 crc kubenswrapper[4778]: I1205 16:17:03.212853 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78776b8fbf-hfsq2" event={"ID":"1f5e616a-22c3-400f-829a-21846578a9d0","Type":"ContainerDied","Data":"38365e34566293d78897bcd2ed14e6b1c4865d36d675cfb5415608f0698e75c4"} Dec 05 16:17:03 crc kubenswrapper[4778]: I1205 16:17:03.212904 4778 scope.go:117] "RemoveContainer" containerID="39df04f83ed5475f698c3ffc8cf4840b4f8831615c8b25b6129b11eeadc26769" Dec 05 16:17:03 crc kubenswrapper[4778]: I1205 16:17:03.212981 4778 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78776b8fbf-hfsq2" Dec 05 16:17:03 crc kubenswrapper[4778]: I1205 16:17:03.261702 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/openstack-galera-0" podStartSLOduration=17.697972451 podStartE2EDuration="48.261667969s" podCreationTimestamp="2025-12-05 16:16:15 +0000 UTC" firstStartedPulling="2025-12-05 16:16:18.123489581 +0000 UTC m=+1265.227285961" lastFinishedPulling="2025-12-05 16:16:48.687185099 +0000 UTC m=+1295.790981479" observedRunningTime="2025-12-05 16:17:03.248708776 +0000 UTC m=+1310.352505166" watchObservedRunningTime="2025-12-05 16:17:03.261667969 +0000 UTC m=+1310.365464389" Dec 05 16:17:03 crc kubenswrapper[4778]: I1205 16:17:03.290073 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78776b8fbf-hfsq2"] Dec 05 16:17:03 crc kubenswrapper[4778]: I1205 16:17:03.296096 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78776b8fbf-hfsq2"] Dec 05 16:17:03 crc kubenswrapper[4778]: I1205 16:17:03.415100 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:17:03 crc kubenswrapper[4778]: I1205 16:17:03.415174 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:17:05 crc kubenswrapper[4778]: I1205 16:17:05.232087 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"966c547a-2be8-4f17-8b73-6a6904e6d6ef","Type":"ContainerStarted","Data":"af97a138be12ac772c50e6acda1ec7a6461e2f05f39e3abb7d2c5038b96be8ae"} Dec 05 16:17:05 crc kubenswrapper[4778]: I1205 16:17:05.260591 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f5e616a-22c3-400f-829a-21846578a9d0" path="/var/lib/kubelet/pods/1f5e616a-22c3-400f-829a-21846578a9d0/volumes" Dec 05 16:17:07 crc kubenswrapper[4778]: I1205 16:17:07.150434 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:17:07 crc kubenswrapper[4778]: I1205 16:17:07.150498 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:17:07 crc kubenswrapper[4778]: I1205 16:17:07.409535 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:17:07 crc kubenswrapper[4778]: I1205 16:17:07.512605 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 16:17:11 crc kubenswrapper[4778]: I1205 16:17:11.338905 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"966c547a-2be8-4f17-8b73-6a6904e6d6ef","Type":"ContainerStarted","Data":"b0c1e5eb7bf54433b1e6bc91914515c8d81a086e0ffaed9bc520f2ad824cb7e4"} Dec 05 16:17:11 crc kubenswrapper[4778]: E1205 16:17:11.552426 4778 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" Dec 05 16:17:12 crc kubenswrapper[4778]: E1205 16:17:12.351089 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b\\\"\"" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" Dec 05 16:17:16 crc kubenswrapper[4778]: I1205 16:17:16.851582 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q"] Dec 05 16:17:16 crc kubenswrapper[4778]: E1205 16:17:16.852626 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f5e616a-22c3-400f-829a-21846578a9d0" containerName="console" Dec 05 16:17:16 crc kubenswrapper[4778]: I1205 16:17:16.852644 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f5e616a-22c3-400f-829a-21846578a9d0" containerName="console" Dec 05 16:17:16 crc kubenswrapper[4778]: I1205 16:17:16.852882 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f5e616a-22c3-400f-829a-21846578a9d0" containerName="console" Dec 05 16:17:16 crc kubenswrapper[4778]: I1205 16:17:16.853729 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" Dec 05 16:17:16 crc kubenswrapper[4778]: I1205 16:17:16.860975 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-db-secret" Dec 05 16:17:16 crc kubenswrapper[4778]: I1205 16:17:16.863066 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q"] Dec 05 16:17:16 crc kubenswrapper[4778]: I1205 16:17:16.900603 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-db-create-xt6b2"] Dec 05 16:17:16 crc kubenswrapper[4778]: I1205 16:17:16.901736 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-xt6b2" Dec 05 16:17:16 crc kubenswrapper[4778]: I1205 16:17:16.912523 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-create-xt6b2"] Dec 05 16:17:16 crc kubenswrapper[4778]: I1205 16:17:16.971574 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zsq8\" (UniqueName: \"kubernetes.io/projected/63653c0c-59aa-47e0-8748-dd487c207a03-kube-api-access-9zsq8\") pod \"keystone-cf4a-account-create-update-7wh4q\" (UID: \"63653c0c-59aa-47e0-8748-dd487c207a03\") " pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" Dec 05 16:17:16 crc kubenswrapper[4778]: I1205 16:17:16.971637 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63653c0c-59aa-47e0-8748-dd487c207a03-operator-scripts\") pod \"keystone-cf4a-account-create-update-7wh4q\" (UID: \"63653c0c-59aa-47e0-8748-dd487c207a03\") " pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.074393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63653c0c-59aa-47e0-8748-dd487c207a03-operator-scripts\") pod \"keystone-cf4a-account-create-update-7wh4q\" (UID: \"63653c0c-59aa-47e0-8748-dd487c207a03\") " pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.074512 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxtt5\" (UniqueName: \"kubernetes.io/projected/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf-kube-api-access-bxtt5\") pod \"keystone-db-create-xt6b2\" (UID: \"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf\") " pod="watcher-kuttl-default/keystone-db-create-xt6b2" Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.074590 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf-operator-scripts\") pod \"keystone-db-create-xt6b2\" (UID: \"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf\") " pod="watcher-kuttl-default/keystone-db-create-xt6b2" Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.074817 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zsq8\" (UniqueName: \"kubernetes.io/projected/63653c0c-59aa-47e0-8748-dd487c207a03-kube-api-access-9zsq8\") pod \"keystone-cf4a-account-create-update-7wh4q\" (UID: \"63653c0c-59aa-47e0-8748-dd487c207a03\") " pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.075054 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63653c0c-59aa-47e0-8748-dd487c207a03-operator-scripts\") pod \"keystone-cf4a-account-create-update-7wh4q\" (UID: \"63653c0c-59aa-47e0-8748-dd487c207a03\") " pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.101258 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zsq8\" (UniqueName: 
\"kubernetes.io/projected/63653c0c-59aa-47e0-8748-dd487c207a03-kube-api-access-9zsq8\") pod \"keystone-cf4a-account-create-update-7wh4q\" (UID: \"63653c0c-59aa-47e0-8748-dd487c207a03\") " pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.176846 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxtt5\" (UniqueName: \"kubernetes.io/projected/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf-kube-api-access-bxtt5\") pod \"keystone-db-create-xt6b2\" (UID: \"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf\") " pod="watcher-kuttl-default/keystone-db-create-xt6b2" Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.176996 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf-operator-scripts\") pod \"keystone-db-create-xt6b2\" (UID: \"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf\") " pod="watcher-kuttl-default/keystone-db-create-xt6b2" Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.177920 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf-operator-scripts\") pod \"keystone-db-create-xt6b2\" (UID: \"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf\") " pod="watcher-kuttl-default/keystone-db-create-xt6b2" Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.201234 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxtt5\" (UniqueName: \"kubernetes.io/projected/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf-kube-api-access-bxtt5\") pod \"keystone-db-create-xt6b2\" (UID: \"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf\") " pod="watcher-kuttl-default/keystone-db-create-xt6b2" Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.216352 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.228062 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-xt6b2" Dec 05 16:17:17 crc kubenswrapper[4778]: W1205 16:17:17.692389 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63653c0c_59aa_47e0_8748_dd487c207a03.slice/crio-2eb6e498ea38c2e9b2de88e7738d5890ea5082c8b174fdb02963de866a8a49ca WatchSource:0}: Error finding container 2eb6e498ea38c2e9b2de88e7738d5890ea5082c8b174fdb02963de866a8a49ca: Status 404 returned error can't find the container with id 2eb6e498ea38c2e9b2de88e7738d5890ea5082c8b174fdb02963de866a8a49ca Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.698265 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q"] Dec 05 16:17:17 crc kubenswrapper[4778]: I1205 16:17:17.749589 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-create-xt6b2"] Dec 05 16:17:17 crc kubenswrapper[4778]: W1205 16:17:17.761657 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c3c6ad9_282f_408e_af9d_c4053b6c5ddf.slice/crio-eeacf6b884dd178ec6b608882795a6010dd59ddf79a36e16763c9e59aa92a45c WatchSource:0}: Error finding container eeacf6b884dd178ec6b608882795a6010dd59ddf79a36e16763c9e59aa92a45c: Status 404 returned error can't find the container with id eeacf6b884dd178ec6b608882795a6010dd59ddf79a36e16763c9e59aa92a45c Dec 05 16:17:18 crc kubenswrapper[4778]: I1205 16:17:18.410234 4778 generic.go:334] "Generic (PLEG): container finished" podID="63653c0c-59aa-47e0-8748-dd487c207a03" containerID="608c15605560825a423471eafd8c2268c1fae110483ad831b002deeeb4b3f41b" exitCode=0 Dec 05 16:17:18 crc kubenswrapper[4778]: I1205 16:17:18.410479 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" event={"ID":"63653c0c-59aa-47e0-8748-dd487c207a03","Type":"ContainerDied","Data":"608c15605560825a423471eafd8c2268c1fae110483ad831b002deeeb4b3f41b"} Dec 05 16:17:18 crc kubenswrapper[4778]: I1205 16:17:18.410615 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" event={"ID":"63653c0c-59aa-47e0-8748-dd487c207a03","Type":"ContainerStarted","Data":"2eb6e498ea38c2e9b2de88e7738d5890ea5082c8b174fdb02963de866a8a49ca"} Dec 05 16:17:18 crc kubenswrapper[4778]: I1205 16:17:18.412276 4778 generic.go:334] "Generic (PLEG): container finished" podID="3c3c6ad9-282f-408e-af9d-c4053b6c5ddf" containerID="65a9ad3dc7d5495c002b0f8976b4c53e8ad1e97cfad9e600c934cb87a0468ca6" exitCode=0 Dec 05 16:17:18 crc kubenswrapper[4778]: I1205 16:17:18.412304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-xt6b2" event={"ID":"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf","Type":"ContainerDied","Data":"65a9ad3dc7d5495c002b0f8976b4c53e8ad1e97cfad9e600c934cb87a0468ca6"} Dec 05 16:17:18 crc kubenswrapper[4778]: I1205 16:17:18.412333 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-xt6b2" event={"ID":"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf","Type":"ContainerStarted","Data":"eeacf6b884dd178ec6b608882795a6010dd59ddf79a36e16763c9e59aa92a45c"} Dec 05 16:17:19 crc kubenswrapper[4778]: I1205 16:17:19.760531 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" Dec 05 16:17:19 crc kubenswrapper[4778]: I1205 16:17:19.763936 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-xt6b2" Dec 05 16:17:19 crc kubenswrapper[4778]: I1205 16:17:19.918217 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63653c0c-59aa-47e0-8748-dd487c207a03-operator-scripts\") pod \"63653c0c-59aa-47e0-8748-dd487c207a03\" (UID: \"63653c0c-59aa-47e0-8748-dd487c207a03\") " Dec 05 16:17:19 crc kubenswrapper[4778]: I1205 16:17:19.918274 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf-operator-scripts\") pod \"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf\" (UID: \"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf\") " Dec 05 16:17:19 crc kubenswrapper[4778]: I1205 16:17:19.918307 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxtt5\" (UniqueName: \"kubernetes.io/projected/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf-kube-api-access-bxtt5\") pod \"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf\" (UID: \"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf\") " Dec 05 16:17:19 crc kubenswrapper[4778]: I1205 16:17:19.918442 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zsq8\" (UniqueName: \"kubernetes.io/projected/63653c0c-59aa-47e0-8748-dd487c207a03-kube-api-access-9zsq8\") pod \"63653c0c-59aa-47e0-8748-dd487c207a03\" (UID: \"63653c0c-59aa-47e0-8748-dd487c207a03\") " Dec 05 16:17:19 crc kubenswrapper[4778]: I1205 16:17:19.919780 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63653c0c-59aa-47e0-8748-dd487c207a03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63653c0c-59aa-47e0-8748-dd487c207a03" (UID: "63653c0c-59aa-47e0-8748-dd487c207a03"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:17:19 crc kubenswrapper[4778]: I1205 16:17:19.919812 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c3c6ad9-282f-408e-af9d-c4053b6c5ddf" (UID: "3c3c6ad9-282f-408e-af9d-c4053b6c5ddf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:17:19 crc kubenswrapper[4778]: I1205 16:17:19.929757 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf-kube-api-access-bxtt5" (OuterVolumeSpecName: "kube-api-access-bxtt5") pod "3c3c6ad9-282f-408e-af9d-c4053b6c5ddf" (UID: "3c3c6ad9-282f-408e-af9d-c4053b6c5ddf"). InnerVolumeSpecName "kube-api-access-bxtt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:17:19 crc kubenswrapper[4778]: I1205 16:17:19.934615 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63653c0c-59aa-47e0-8748-dd487c207a03-kube-api-access-9zsq8" (OuterVolumeSpecName: "kube-api-access-9zsq8") pod "63653c0c-59aa-47e0-8748-dd487c207a03" (UID: "63653c0c-59aa-47e0-8748-dd487c207a03"). InnerVolumeSpecName "kube-api-access-9zsq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:17:20 crc kubenswrapper[4778]: I1205 16:17:20.020499 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63653c0c-59aa-47e0-8748-dd487c207a03-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:17:20 crc kubenswrapper[4778]: I1205 16:17:20.020540 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:17:20 crc kubenswrapper[4778]: I1205 16:17:20.020558 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxtt5\" (UniqueName: \"kubernetes.io/projected/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf-kube-api-access-bxtt5\") on node \"crc\" DevicePath \"\"" Dec 05 16:17:20 crc kubenswrapper[4778]: I1205 16:17:20.020572 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zsq8\" (UniqueName: \"kubernetes.io/projected/63653c0c-59aa-47e0-8748-dd487c207a03-kube-api-access-9zsq8\") on node \"crc\" DevicePath \"\"" Dec 05 16:17:20 crc kubenswrapper[4778]: I1205 16:17:20.434651 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" event={"ID":"63653c0c-59aa-47e0-8748-dd487c207a03","Type":"ContainerDied","Data":"2eb6e498ea38c2e9b2de88e7738d5890ea5082c8b174fdb02963de866a8a49ca"} Dec 05 16:17:20 crc kubenswrapper[4778]: I1205 16:17:20.434688 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eb6e498ea38c2e9b2de88e7738d5890ea5082c8b174fdb02963de866a8a49ca" Dec 05 16:17:20 crc kubenswrapper[4778]: I1205 16:17:20.434691 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q" Dec 05 16:17:20 crc kubenswrapper[4778]: I1205 16:17:20.436303 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-xt6b2" event={"ID":"3c3c6ad9-282f-408e-af9d-c4053b6c5ddf","Type":"ContainerDied","Data":"eeacf6b884dd178ec6b608882795a6010dd59ddf79a36e16763c9e59aa92a45c"} Dec 05 16:17:20 crc kubenswrapper[4778]: I1205 16:17:20.436473 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeacf6b884dd178ec6b608882795a6010dd59ddf79a36e16763c9e59aa92a45c" Dec 05 16:17:20 crc kubenswrapper[4778]: I1205 16:17:20.436846 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-xt6b2"
Dec 05 16:17:21 crc kubenswrapper[4778]: I1205 16:17:21.446817 4778 generic.go:334] "Generic (PLEG): container finished" podID="daf89267-199f-4532-b4f7-a74fc2ef5425" containerID="e6f332fb9d6fdd90b3000588be004e4bac9f2726275eeadcccede6cff5e62788" exitCode=0
Dec 05 16:17:21 crc kubenswrapper[4778]: I1205 16:17:21.446921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"daf89267-199f-4532-b4f7-a74fc2ef5425","Type":"ContainerDied","Data":"e6f332fb9d6fdd90b3000588be004e4bac9f2726275eeadcccede6cff5e62788"}
Dec 05 16:17:22 crc kubenswrapper[4778]: I1205 16:17:22.454756 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"daf89267-199f-4532-b4f7-a74fc2ef5425","Type":"ContainerStarted","Data":"4563c49cbf89e93934569de1663c7e937584a3ef51a7986e5c2fcda5ddb27cdd"}
Dec 05 16:17:22 crc kubenswrapper[4778]: I1205 16:17:22.455201 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/rabbitmq-server-0"
Dec 05 16:17:22 crc kubenswrapper[4778]: I1205 16:17:22.481111 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/rabbitmq-server-0" podStartSLOduration=37.865201339 podStartE2EDuration="1m9.481093652s" podCreationTimestamp="2025-12-05 16:16:13 +0000 UTC" firstStartedPulling="2025-12-05 16:16:16.083708254 +0000 UTC m=+1263.187504644" lastFinishedPulling="2025-12-05 16:16:47.699600557 +0000 UTC m=+1294.803396957" observedRunningTime="2025-12-05 16:17:22.473736937 +0000 UTC m=+1329.577533317" watchObservedRunningTime="2025-12-05 16:17:22.481093652 +0000 UTC m=+1329.584890032"
Dec 05 16:17:29 crc kubenswrapper[4778]: I1205 16:17:29.513536 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"966c547a-2be8-4f17-8b73-6a6904e6d6ef","Type":"ContainerStarted","Data":"75ff93fe0442360bd45cb613b5ebbd4eb6e861fd5639a56909207c753acec3db"}
Dec 05 16:17:29 crc kubenswrapper[4778]: I1205 16:17:29.515906 4778 generic.go:334] "Generic (PLEG): container finished" podID="f73e1f56-f326-4886-9a0d-8f72407ebeb6" containerID="23d2decd8cfc30b3efb75bc47a6044b70036e730721a95e844c61683dce67f7a" exitCode=0
Dec 05 16:17:29 crc kubenswrapper[4778]: I1205 16:17:29.515960 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"f73e1f56-f326-4886-9a0d-8f72407ebeb6","Type":"ContainerDied","Data":"23d2decd8cfc30b3efb75bc47a6044b70036e730721a95e844c61683dce67f7a"}
Dec 05 16:17:29 crc kubenswrapper[4778]: I1205 16:17:29.545926 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/prometheus-metric-storage-0" podStartSLOduration=4.07682557 podStartE2EDuration="1m12.54591068s" podCreationTimestamp="2025-12-05 16:16:17 +0000 UTC" firstStartedPulling="2025-12-05 16:16:20.147239193 +0000 UTC m=+1267.251035583" lastFinishedPulling="2025-12-05 16:17:28.616324303 +0000 UTC m=+1335.720120693" observedRunningTime="2025-12-05 16:17:29.539411097 +0000 UTC m=+1336.643207517" watchObservedRunningTime="2025-12-05 16:17:29.54591068 +0000 UTC m=+1336.649707060"
Dec 05 16:17:30 crc kubenswrapper[4778]: I1205 16:17:30.525244 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"f73e1f56-f326-4886-9a0d-8f72407ebeb6","Type":"ContainerStarted","Data":"a2b5b5e9e6fc6174a99e316b9a5ba4c1082c7b02cd93e8ae7c86150717fb31dd"}
Dec 05 16:17:30 crc kubenswrapper[4778]: I1205 16:17:30.525774 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Dec 05 16:17:30 crc kubenswrapper[4778]: I1205 16:17:30.545030 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" podStartSLOduration=-9223371960.309761 podStartE2EDuration="1m16.545014887s" podCreationTimestamp="2025-12-05 16:16:14 +0000 UTC" firstStartedPulling="2025-12-05 16:16:16.439185221 +0000 UTC m=+1263.542981601" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:30.543529098 +0000 UTC m=+1337.647325478" watchObservedRunningTime="2025-12-05 16:17:30.545014887 +0000 UTC m=+1337.648811267"
Dec 05 16:17:33 crc kubenswrapper[4778]: I1205 16:17:33.414387 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 16:17:33 crc kubenswrapper[4778]: I1205 16:17:33.414722 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 16:17:33 crc kubenswrapper[4778]: I1205 16:17:33.414763 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw"
Dec 05 16:17:33 crc kubenswrapper[4778]: I1205 16:17:33.415438 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9838a46c7fca5484e5528acba6a6dc7600262ec3d0517e19089e823847361767"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 16:17:33 crc kubenswrapper[4778]: I1205 16:17:33.415501 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://9838a46c7fca5484e5528acba6a6dc7600262ec3d0517e19089e823847361767" gracePeriod=600
Dec 05 16:17:33 crc kubenswrapper[4778]: I1205 16:17:33.553635 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="9838a46c7fca5484e5528acba6a6dc7600262ec3d0517e19089e823847361767" exitCode=0
Dec 05 16:17:33 crc kubenswrapper[4778]: I1205 16:17:33.553695 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"9838a46c7fca5484e5528acba6a6dc7600262ec3d0517e19089e823847361767"}
Dec 05 16:17:33 crc kubenswrapper[4778]: I1205 16:17:33.554011 4778 scope.go:117] "RemoveContainer" containerID="aea0312e36d87c23ce634b679d6ae2137df783585ff65eb7e4e65c9564abd0b6"
Dec 05 16:17:34 crc kubenswrapper[4778]: I1205 16:17:34.325631 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:34 crc kubenswrapper[4778]: I1205 16:17:34.325976 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:34 crc kubenswrapper[4778]: I1205 16:17:34.330188 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:34 crc kubenswrapper[4778]: I1205 16:17:34.563637 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0"}
Dec 05 16:17:34 crc kubenswrapper[4778]: I1205 16:17:34.565032 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:35 crc kubenswrapper[4778]: I1205 16:17:35.325524 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/rabbitmq-server-0"
Dec 05 16:17:35 crc kubenswrapper[4778]: I1205 16:17:35.923127 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-db-sync-5996h"]
Dec 05 16:17:35 crc kubenswrapper[4778]: E1205 16:17:35.923696 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3c6ad9-282f-408e-af9d-c4053b6c5ddf" containerName="mariadb-database-create"
Dec 05 16:17:35 crc kubenswrapper[4778]: I1205 16:17:35.923711 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3c6ad9-282f-408e-af9d-c4053b6c5ddf" containerName="mariadb-database-create"
Dec 05 16:17:35 crc kubenswrapper[4778]: E1205 16:17:35.923725 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63653c0c-59aa-47e0-8748-dd487c207a03" containerName="mariadb-account-create-update"
Dec 05 16:17:35 crc kubenswrapper[4778]: I1205 16:17:35.923732 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="63653c0c-59aa-47e0-8748-dd487c207a03" containerName="mariadb-account-create-update"
Dec 05 16:17:35 crc kubenswrapper[4778]: I1205 16:17:35.923867 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c3c6ad9-282f-408e-af9d-c4053b6c5ddf" containerName="mariadb-database-create"
Dec 05 16:17:35 crc kubenswrapper[4778]: I1205 16:17:35.923878 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="63653c0c-59aa-47e0-8748-dd487c207a03" containerName="mariadb-account-create-update"
Dec 05 16:17:35 crc kubenswrapper[4778]: I1205 16:17:35.924446 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:17:35 crc kubenswrapper[4778]: I1205 16:17:35.926761 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data"
Dec 05 16:17:35 crc kubenswrapper[4778]: I1205 16:17:35.927283 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone"
Dec 05 16:17:35 crc kubenswrapper[4778]: I1205 16:17:35.927309 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts"
Dec 05 16:17:35 crc kubenswrapper[4778]: I1205 16:17:35.928617 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-s5hnz"
Dec 05 16:17:35 crc kubenswrapper[4778]: I1205 16:17:35.936714 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-5996h"]
Dec 05 16:17:36 crc kubenswrapper[4778]: I1205 16:17:36.062678 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94083ec4-63e0-44b2-9181-808017479ef8-combined-ca-bundle\") pod \"keystone-db-sync-5996h\" (UID: \"94083ec4-63e0-44b2-9181-808017479ef8\") " pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:17:36 crc kubenswrapper[4778]: I1205 16:17:36.062729 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94083ec4-63e0-44b2-9181-808017479ef8-config-data\") pod \"keystone-db-sync-5996h\" (UID: \"94083ec4-63e0-44b2-9181-808017479ef8\") " pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:17:36 crc kubenswrapper[4778]: I1205 16:17:36.062909 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt84f\" (UniqueName: \"kubernetes.io/projected/94083ec4-63e0-44b2-9181-808017479ef8-kube-api-access-gt84f\") pod \"keystone-db-sync-5996h\" (UID: \"94083ec4-63e0-44b2-9181-808017479ef8\") " pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:17:36 crc kubenswrapper[4778]: I1205 16:17:36.164504 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94083ec4-63e0-44b2-9181-808017479ef8-combined-ca-bundle\") pod \"keystone-db-sync-5996h\" (UID: \"94083ec4-63e0-44b2-9181-808017479ef8\") " pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:17:36 crc kubenswrapper[4778]: I1205 16:17:36.164618 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94083ec4-63e0-44b2-9181-808017479ef8-config-data\") pod \"keystone-db-sync-5996h\" (UID: \"94083ec4-63e0-44b2-9181-808017479ef8\") " pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:17:36 crc kubenswrapper[4778]: I1205 16:17:36.164720 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt84f\" (UniqueName: \"kubernetes.io/projected/94083ec4-63e0-44b2-9181-808017479ef8-kube-api-access-gt84f\") pod \"keystone-db-sync-5996h\" (UID: \"94083ec4-63e0-44b2-9181-808017479ef8\") " pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:17:36 crc kubenswrapper[4778]: I1205 16:17:36.170664 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94083ec4-63e0-44b2-9181-808017479ef8-combined-ca-bundle\") pod \"keystone-db-sync-5996h\" (UID: \"94083ec4-63e0-44b2-9181-808017479ef8\") " pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:17:36 crc kubenswrapper[4778]: I1205 16:17:36.180583 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94083ec4-63e0-44b2-9181-808017479ef8-config-data\") pod \"keystone-db-sync-5996h\" (UID: \"94083ec4-63e0-44b2-9181-808017479ef8\") " pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:17:36 crc kubenswrapper[4778]: I1205 16:17:36.194020 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt84f\" (UniqueName: \"kubernetes.io/projected/94083ec4-63e0-44b2-9181-808017479ef8-kube-api-access-gt84f\") pod \"keystone-db-sync-5996h\" (UID: \"94083ec4-63e0-44b2-9181-808017479ef8\") " pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:17:36 crc kubenswrapper[4778]: I1205 16:17:36.246958 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:17:36 crc kubenswrapper[4778]: I1205 16:17:36.515385 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-5996h"]
Dec 05 16:17:36 crc kubenswrapper[4778]: W1205 16:17:36.526152 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94083ec4_63e0_44b2_9181_808017479ef8.slice/crio-30d33ae1deb0b31ef5d94265230a5bf91e1c42ba5d053ae88d354b0a3c3b19fe WatchSource:0}: Error finding container 30d33ae1deb0b31ef5d94265230a5bf91e1c42ba5d053ae88d354b0a3c3b19fe: Status 404 returned error can't find the container with id 30d33ae1deb0b31ef5d94265230a5bf91e1c42ba5d053ae88d354b0a3c3b19fe
Dec 05 16:17:36 crc kubenswrapper[4778]: I1205 16:17:36.578469 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-5996h" event={"ID":"94083ec4-63e0-44b2-9181-808017479ef8","Type":"ContainerStarted","Data":"30d33ae1deb0b31ef5d94265230a5bf91e1c42ba5d053ae88d354b0a3c3b19fe"}
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.124470 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.125078 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="config-reloader" containerID="cri-o://af97a138be12ac772c50e6acda1ec7a6461e2f05f39e3abb7d2c5038b96be8ae" gracePeriod=600
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.125195 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="thanos-sidecar" containerID="cri-o://b0c1e5eb7bf54433b1e6bc91914515c8d81a086e0ffaed9bc520f2ad824cb7e4" gracePeriod=600
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.125168 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="prometheus" containerID="cri-o://75ff93fe0442360bd45cb613b5ebbd4eb6e861fd5639a56909207c753acec3db" gracePeriod=600
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.612500 4778 generic.go:334] "Generic (PLEG): container finished" podID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerID="75ff93fe0442360bd45cb613b5ebbd4eb6e861fd5639a56909207c753acec3db" exitCode=0
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.612534 4778 generic.go:334] "Generic (PLEG): container finished" podID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerID="b0c1e5eb7bf54433b1e6bc91914515c8d81a086e0ffaed9bc520f2ad824cb7e4" exitCode=0
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.612545 4778 generic.go:334] "Generic (PLEG): container finished" podID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerID="af97a138be12ac772c50e6acda1ec7a6461e2f05f39e3abb7d2c5038b96be8ae" exitCode=0
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.612569 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"966c547a-2be8-4f17-8b73-6a6904e6d6ef","Type":"ContainerDied","Data":"75ff93fe0442360bd45cb613b5ebbd4eb6e861fd5639a56909207c753acec3db"}
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.612598 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"966c547a-2be8-4f17-8b73-6a6904e6d6ef","Type":"ContainerDied","Data":"b0c1e5eb7bf54433b1e6bc91914515c8d81a086e0ffaed9bc520f2ad824cb7e4"}
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.612611 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"966c547a-2be8-4f17-8b73-6a6904e6d6ef","Type":"ContainerDied","Data":"af97a138be12ac772c50e6acda1ec7a6461e2f05f39e3abb7d2c5038b96be8ae"}
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.765377 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.894182 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-web-config\") pod \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") "
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.894267 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/966c547a-2be8-4f17-8b73-6a6904e6d6ef-config-out\") pod \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") "
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.894389 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-thanos-prometheus-http-client-file\") pod \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") "
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.894427 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-config\") pod \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") "
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.894476 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/966c547a-2be8-4f17-8b73-6a6904e6d6ef-prometheus-metric-storage-rulefiles-0\") pod \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") "
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.894916 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\") pod \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") "
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.894975 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/966c547a-2be8-4f17-8b73-6a6904e6d6ef-tls-assets\") pod \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") "
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.895016 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnhtg\" (UniqueName: \"kubernetes.io/projected/966c547a-2be8-4f17-8b73-6a6904e6d6ef-kube-api-access-bnhtg\") pod \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\" (UID: \"966c547a-2be8-4f17-8b73-6a6904e6d6ef\") "
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.895979 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/966c547a-2be8-4f17-8b73-6a6904e6d6ef-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "966c547a-2be8-4f17-8b73-6a6904e6d6ef" (UID: "966c547a-2be8-4f17-8b73-6a6904e6d6ef"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.902394 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "966c547a-2be8-4f17-8b73-6a6904e6d6ef" (UID: "966c547a-2be8-4f17-8b73-6a6904e6d6ef"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.902916 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-config" (OuterVolumeSpecName: "config") pod "966c547a-2be8-4f17-8b73-6a6904e6d6ef" (UID: "966c547a-2be8-4f17-8b73-6a6904e6d6ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.904197 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/966c547a-2be8-4f17-8b73-6a6904e6d6ef-config-out" (OuterVolumeSpecName: "config-out") pod "966c547a-2be8-4f17-8b73-6a6904e6d6ef" (UID: "966c547a-2be8-4f17-8b73-6a6904e6d6ef"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.905080 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966c547a-2be8-4f17-8b73-6a6904e6d6ef-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "966c547a-2be8-4f17-8b73-6a6904e6d6ef" (UID: "966c547a-2be8-4f17-8b73-6a6904e6d6ef"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.922744 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "966c547a-2be8-4f17-8b73-6a6904e6d6ef" (UID: "966c547a-2be8-4f17-8b73-6a6904e6d6ef"). InnerVolumeSpecName "pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.924118 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966c547a-2be8-4f17-8b73-6a6904e6d6ef-kube-api-access-bnhtg" (OuterVolumeSpecName: "kube-api-access-bnhtg") pod "966c547a-2be8-4f17-8b73-6a6904e6d6ef" (UID: "966c547a-2be8-4f17-8b73-6a6904e6d6ef"). InnerVolumeSpecName "kube-api-access-bnhtg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:17:37 crc kubenswrapper[4778]: I1205 16:17:37.929459 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-web-config" (OuterVolumeSpecName: "web-config") pod "966c547a-2be8-4f17-8b73-6a6904e6d6ef" (UID: "966c547a-2be8-4f17-8b73-6a6904e6d6ef"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.000199 4778 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/966c547a-2be8-4f17-8b73-6a6904e6d6ef-tls-assets\") on node \"crc\" DevicePath \"\""
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.000236 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnhtg\" (UniqueName: \"kubernetes.io/projected/966c547a-2be8-4f17-8b73-6a6904e6d6ef-kube-api-access-bnhtg\") on node \"crc\" DevicePath \"\""
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.000248 4778 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-web-config\") on node \"crc\" DevicePath \"\""
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.000256 4778 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/966c547a-2be8-4f17-8b73-6a6904e6d6ef-config-out\") on node \"crc\" DevicePath \"\""
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.000267 4778 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.000275 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/966c547a-2be8-4f17-8b73-6a6904e6d6ef-config\") on node \"crc\" DevicePath \"\""
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.000284 4778 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/966c547a-2be8-4f17-8b73-6a6904e6d6ef-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.000324 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\") on node \"crc\" "
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.021820 4778 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.021992 4778 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044") on node "crc"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.102096 4778 reconciler_common.go:293] "Volume detached for volume \"pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\") on node \"crc\" DevicePath \"\""
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.634383 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"966c547a-2be8-4f17-8b73-6a6904e6d6ef","Type":"ContainerDied","Data":"43ad9fb80a7ea7a0e9ce6e2ff28dfc014305e99f0edb2ab7fd1420e553825660"}
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.634953 4778 scope.go:117] "RemoveContainer" containerID="75ff93fe0442360bd45cb613b5ebbd4eb6e861fd5639a56909207c753acec3db"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.634450 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.668775 4778 scope.go:117] "RemoveContainer" containerID="b0c1e5eb7bf54433b1e6bc91914515c8d81a086e0ffaed9bc520f2ad824cb7e4"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.689233 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.699720 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.713403 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 05 16:17:38 crc kubenswrapper[4778]: E1205 16:17:38.713817 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="prometheus"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.713834 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="prometheus"
Dec 05 16:17:38 crc kubenswrapper[4778]: E1205 16:17:38.713842 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="thanos-sidecar"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.713848 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="thanos-sidecar"
Dec 05 16:17:38 crc kubenswrapper[4778]: E1205 16:17:38.713854 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="config-reloader"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.713861 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="config-reloader"
Dec 05 16:17:38 crc kubenswrapper[4778]: E1205 16:17:38.713877 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="init-config-reloader"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.713882 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="init-config-reloader"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.714058 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="prometheus"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.714069 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="config-reloader"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.714082 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" containerName="thanos-sidecar"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.714967 4778 scope.go:117] "RemoveContainer" containerID="af97a138be12ac772c50e6acda1ec7a6461e2f05f39e3abb7d2c5038b96be8ae"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.715519 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.720743 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-prometheus-dockercfg-m9tj2"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.721021 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.721169 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.721299 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.721509 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-web-config"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.721648 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-metric-storage-prometheus-svc"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.727747 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-tls-assets-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.731413 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.786156 4778 scope.go:117] "RemoveContainer" containerID="27c4f772336a6d6f5d0b4e945f22dbbfc1e55d42f9bbd8dcdac95f41db86583d"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.812354 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.812404 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.812429 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c124cdbe-b3e4-465b-8657-9b749da2e709-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.812467 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.812486 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c124cdbe-b3e4-465b-8657-9b749da2e709-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.812504 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh7xx\" (UniqueName: \"kubernetes.io/projected/c124cdbe-b3e4-465b-8657-9b749da2e709-kube-api-access-qh7xx\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.812536 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c124cdbe-b3e4-465b-8657-9b749da2e709-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.812574 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-config\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.812589 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.812611 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.812637 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.913810 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.913896 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.913932 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.913962 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c124cdbe-b3e4-465b-8657-9b749da2e709-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.914012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.914038 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c124cdbe-b3e4-465b-8657-9b749da2e709-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.914062 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh7xx\" (UniqueName: \"kubernetes.io/projected/c124cdbe-b3e4-465b-8657-9b749da2e709-kube-api-access-qh7xx\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.914111 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c124cdbe-b3e4-465b-8657-9b749da2e709-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.914165 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-config\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.914187 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.914217 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.918652 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c124cdbe-b3e4-465b-8657-9b749da2e709-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.919614 4778 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.919656 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cb4b5983a897e7197d99054548cb4b72c2c3c3eece65a6205a52532f13a56352/globalmount\"" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.921533 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c124cdbe-b3e4-465b-8657-9b749da2e709-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.921804 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c124cdbe-b3e4-465b-8657-9b749da2e709-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.922153 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-config\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.923018 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.923096 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.923908 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.932286 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.934317 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c124cdbe-b3e4-465b-8657-9b749da2e709-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.940876 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh7xx\" (UniqueName: \"kubernetes.io/projected/c124cdbe-b3e4-465b-8657-9b749da2e709-kube-api-access-qh7xx\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:38 crc kubenswrapper[4778]: I1205 16:17:38.967491 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23e36b7-6bca-4c29-9eb0-4115e85ce044\") pod \"prometheus-metric-storage-0\" (UID: \"c124cdbe-b3e4-465b-8657-9b749da2e709\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:39 crc kubenswrapper[4778]: I1205 16:17:39.049358 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:17:39 crc kubenswrapper[4778]: I1205 16:17:39.262001 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="966c547a-2be8-4f17-8b73-6a6904e6d6ef" path="/var/lib/kubelet/pods/966c547a-2be8-4f17-8b73-6a6904e6d6ef/volumes"
Dec 05 16:17:39 crc kubenswrapper[4778]: W1205 16:17:39.492873 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc124cdbe_b3e4_465b_8657_9b749da2e709.slice/crio-6005299248ccb461a4f77c6bf4a6601b8977568c91056415b0c61c393c55aa19 WatchSource:0}: Error finding container 6005299248ccb461a4f77c6bf4a6601b8977568c91056415b0c61c393c55aa19: Status 404 returned error can't find the container with id 6005299248ccb461a4f77c6bf4a6601b8977568c91056415b0c61c393c55aa19
Dec 05 16:17:39 crc kubenswrapper[4778]: I1205 16:17:39.495072 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 05 16:17:39 crc kubenswrapper[4778]: I1205 16:17:39.654828 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c124cdbe-b3e4-465b-8657-9b749da2e709","Type":"ContainerStarted","Data":"6005299248ccb461a4f77c6bf4a6601b8977568c91056415b0c61c393c55aa19"}
Dec 05 16:17:42 crc kubenswrapper[4778]: I1205 16:17:42.683122 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c124cdbe-b3e4-465b-8657-9b749da2e709","Type":"ContainerStarted","Data":"f95e84651e4c5118e85a9d7fd65a46c442a44617f0395ca153e8125388e54f4e"}
Dec 05 16:17:45 crc kubenswrapper[4778]: I1205 16:17:45.936111 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Dec 05 16:17:52 crc kubenswrapper[4778]: E1205 16:17:52.955536 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-keystone:current-podified"
Dec 05 16:17:52 crc kubenswrapper[4778]: E1205 16:17:52.956095 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:keystone-db-sync,Image:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,Command:[/bin/bash],Args:[-c keystone-manage db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/keystone/keystone.conf,SubPath:keystone.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gt84f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42425,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42425,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-db-sync-5996h_watcher-kuttl-default(94083ec4-63e0-44b2-9181-808017479ef8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 16:17:52 crc kubenswrapper[4778]: E1205 16:17:52.957605 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/keystone-db-sync-5996h" podUID="94083ec4-63e0-44b2-9181-808017479ef8"
Dec 05 16:17:53 crc kubenswrapper[4778]: I1205 16:17:53.775914 4778 generic.go:334] "Generic (PLEG): container finished" podID="c124cdbe-b3e4-465b-8657-9b749da2e709" containerID="f95e84651e4c5118e85a9d7fd65a46c442a44617f0395ca153e8125388e54f4e" exitCode=0
Dec 05 16:17:53 crc kubenswrapper[4778]: I1205 16:17:53.777324 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c124cdbe-b3e4-465b-8657-9b749da2e709","Type":"ContainerDied","Data":"f95e84651e4c5118e85a9d7fd65a46c442a44617f0395ca153e8125388e54f4e"}
Dec 05 16:17:53 crc kubenswrapper[4778]: E1205 16:17:53.780990 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-keystone:current-podified\\\"\"" pod="watcher-kuttl-default/keystone-db-sync-5996h" podUID="94083ec4-63e0-44b2-9181-808017479ef8"
Dec 05 16:17:54 crc kubenswrapper[4778]: I1205 16:17:54.785892 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c124cdbe-b3e4-465b-8657-9b749da2e709","Type":"ContainerStarted","Data":"ebdcfde5a043fa89bc6c3379cd9dbdeb5817dcd2327f06bd28b4dcdfa1809996"}
Dec 05 16:17:56 crc kubenswrapper[4778]: I1205 16:17:56.800990 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c124cdbe-b3e4-465b-8657-9b749da2e709","Type":"ContainerStarted","Data":"312752d1822165b9ac52826063261fd3a1d617c26f7bef40bdfe5950fdbfedf4"}
Dec 05 16:17:57 crc kubenswrapper[4778]: I1205 16:17:57.814506 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c124cdbe-b3e4-465b-8657-9b749da2e709","Type":"ContainerStarted","Data":"ad2f5556f4da99b5a1e9e2e5e6fe01a4dd915ba0a068533056c4e5c445ffef98"}
Dec 05 16:17:57 crc kubenswrapper[4778]: I1205 16:17:57.850699 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/prometheus-metric-storage-0" podStartSLOduration=19.85067534 podStartE2EDuration="19.85067534s" podCreationTimestamp="2025-12-05 16:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:57.842317244 +0000 UTC m=+1364.946113664" watchObservedRunningTime="2025-12-05 16:17:57.85067534 +0000 UTC m=+1364.954471730"
Dec 05 16:17:59 crc kubenswrapper[4778]: I1205 16:17:59.050436 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:18:06 crc kubenswrapper[4778]: I1205 16:18:06.886833 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-5996h" event={"ID":"94083ec4-63e0-44b2-9181-808017479ef8","Type":"ContainerStarted","Data":"eb32e4acd688ec98c6ee444f3c942582c9aba4e2af4117b37984f66d3a2a48ea"}
Dec 05 16:18:06 crc kubenswrapper[4778]: I1205 16:18:06.912700 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-db-sync-5996h" podStartSLOduration=2.582095748 podStartE2EDuration="31.912678011s" podCreationTimestamp="2025-12-05 16:17:35 +0000 UTC" firstStartedPulling="2025-12-05 16:17:36.529424412 +0000 UTC m=+1343.633220792" lastFinishedPulling="2025-12-05 16:18:05.860006675 +0000 UTC m=+1372.963803055" observedRunningTime="2025-12-05 16:18:06.908007188 +0000 UTC m=+1374.011803588" watchObservedRunningTime="2025-12-05 16:18:06.912678011 +0000 UTC m=+1374.016474391"
Dec 05 16:18:09 crc kubenswrapper[4778]: I1205 16:18:09.050245 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:18:09 crc kubenswrapper[4778]: I1205 16:18:09.072903 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:18:09 crc kubenswrapper[4778]: I1205 16:18:09.910703 4778 generic.go:334] "Generic (PLEG): container finished" podID="94083ec4-63e0-44b2-9181-808017479ef8" containerID="eb32e4acd688ec98c6ee444f3c942582c9aba4e2af4117b37984f66d3a2a48ea" exitCode=0
Dec 05 16:18:09 crc kubenswrapper[4778]: I1205 16:18:09.910790 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-5996h" event={"ID":"94083ec4-63e0-44b2-9181-808017479ef8","Type":"ContainerDied","Data":"eb32e4acd688ec98c6ee444f3c942582c9aba4e2af4117b37984f66d3a2a48ea"}
Dec 05 16:18:09 crc kubenswrapper[4778]: I1205 16:18:09.915628 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.209740 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.297144 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94083ec4-63e0-44b2-9181-808017479ef8-config-data\") pod \"94083ec4-63e0-44b2-9181-808017479ef8\" (UID: \"94083ec4-63e0-44b2-9181-808017479ef8\") "
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.297203 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt84f\" (UniqueName: \"kubernetes.io/projected/94083ec4-63e0-44b2-9181-808017479ef8-kube-api-access-gt84f\") pod \"94083ec4-63e0-44b2-9181-808017479ef8\" (UID: \"94083ec4-63e0-44b2-9181-808017479ef8\") "
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.297273 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94083ec4-63e0-44b2-9181-808017479ef8-combined-ca-bundle\") pod \"94083ec4-63e0-44b2-9181-808017479ef8\" (UID: \"94083ec4-63e0-44b2-9181-808017479ef8\") "
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.302108 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94083ec4-63e0-44b2-9181-808017479ef8-kube-api-access-gt84f" (OuterVolumeSpecName: "kube-api-access-gt84f") pod "94083ec4-63e0-44b2-9181-808017479ef8" (UID: "94083ec4-63e0-44b2-9181-808017479ef8"). InnerVolumeSpecName "kube-api-access-gt84f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.319795 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94083ec4-63e0-44b2-9181-808017479ef8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94083ec4-63e0-44b2-9181-808017479ef8" (UID: "94083ec4-63e0-44b2-9181-808017479ef8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.335525 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94083ec4-63e0-44b2-9181-808017479ef8-config-data" (OuterVolumeSpecName: "config-data") pod "94083ec4-63e0-44b2-9181-808017479ef8" (UID: "94083ec4-63e0-44b2-9181-808017479ef8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.399068 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94083ec4-63e0-44b2-9181-808017479ef8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.399098 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94083ec4-63e0-44b2-9181-808017479ef8-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.399107 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt84f\" (UniqueName: \"kubernetes.io/projected/94083ec4-63e0-44b2-9181-808017479ef8-kube-api-access-gt84f\") on node \"crc\" DevicePath \"\""
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.928268 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-5996h" event={"ID":"94083ec4-63e0-44b2-9181-808017479ef8","Type":"ContainerDied","Data":"30d33ae1deb0b31ef5d94265230a5bf91e1c42ba5d053ae88d354b0a3c3b19fe"}
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.928320 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30d33ae1deb0b31ef5d94265230a5bf91e1c42ba5d053ae88d354b0a3c3b19fe"
Dec 05 16:18:11 crc kubenswrapper[4778]: I1205 16:18:11.928338 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-5996h"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.146075 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-gdlgn"]
Dec 05 16:18:12 crc kubenswrapper[4778]: E1205 16:18:12.146960 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94083ec4-63e0-44b2-9181-808017479ef8" containerName="keystone-db-sync"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.146989 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="94083ec4-63e0-44b2-9181-808017479ef8" containerName="keystone-db-sync"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.147294 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="94083ec4-63e0-44b2-9181-808017479ef8" containerName="keystone-db-sync"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.148229 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.153136 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.153227 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.153228 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.153346 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"osp-secret"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.153520 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-s5hnz"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.156623 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-gdlgn"]
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.213481 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-fernet-keys\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.213524 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-credential-keys\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.213553 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-config-data\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.213648 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-combined-ca-bundle\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.213700 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgnwk\" (UniqueName: \"kubernetes.io/projected/9028e391-6a57-4546-8f3c-8a0eecc5f053-kube-api-access-lgnwk\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.213743 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-scripts\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.277326 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.310734 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.314892 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.315071 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.315124 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.316234 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgnwk\" (UniqueName: \"kubernetes.io/projected/9028e391-6a57-4546-8f3c-8a0eecc5f053-kube-api-access-lgnwk\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.316639 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-scripts\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.316689 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-fernet-keys\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.316712 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-credential-keys\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.316754 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-config-data\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.316806 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-combined-ca-bundle\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.326231 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-credential-keys\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.326387 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-fernet-keys\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.327032 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-config-data\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.340315 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-combined-ca-bundle\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.340968 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-scripts\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.344261 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgnwk\" (UniqueName: \"kubernetes.io/projected/9028e391-6a57-4546-8f3c-8a0eecc5f053-kube-api-access-lgnwk\") pod \"keystone-bootstrap-gdlgn\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " pod="watcher-kuttl-default/keystone-bootstrap-gdlgn"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.418135 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvll9\" (UniqueName: \"kubernetes.io/projected/156cc8f8-5c0b-46c9-ae14-3a3493473f75-kube-api-access-gvll9\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.418200 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-config-data\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.418227 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/156cc8f8-5c0b-46c9-ae14-3a3493473f75-run-httpd\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.418531 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/156cc8f8-5c0b-46c9-ae14-3a3493473f75-log-httpd\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05
16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.418651 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.418790 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.418836 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-scripts\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.464670 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-gdlgn" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.520576 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.521054 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-scripts\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.521102 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvll9\" (UniqueName: \"kubernetes.io/projected/156cc8f8-5c0b-46c9-ae14-3a3493473f75-kube-api-access-gvll9\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.521123 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-config-data\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.521139 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/156cc8f8-5c0b-46c9-ae14-3a3493473f75-run-httpd\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.521187 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/156cc8f8-5c0b-46c9-ae14-3a3493473f75-log-httpd\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc 
kubenswrapper[4778]: I1205 16:18:12.521234 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.524341 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/156cc8f8-5c0b-46c9-ae14-3a3493473f75-run-httpd\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.524663 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/156cc8f8-5c0b-46c9-ae14-3a3493473f75-log-httpd\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.524833 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-scripts\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.525161 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.527445 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-config-data\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.531242 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.539591 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvll9\" (UniqueName: \"kubernetes.io/projected/156cc8f8-5c0b-46c9-ae14-3a3493473f75-kube-api-access-gvll9\") pod \"ceilometer-0\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.694758 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:12 crc kubenswrapper[4778]: I1205 16:18:12.977521 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-gdlgn"] Dec 05 16:18:13 crc kubenswrapper[4778]: I1205 16:18:13.270584 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:18:13 crc kubenswrapper[4778]: W1205 16:18:13.274707 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod156cc8f8_5c0b_46c9_ae14_3a3493473f75.slice/crio-45d9069d109a9d2b04db39805ba67cb6f94d35f38b9714309b46d0021032913f WatchSource:0}: Error finding container 45d9069d109a9d2b04db39805ba67cb6f94d35f38b9714309b46d0021032913f: Status 404 returned error can't find the container with id 45d9069d109a9d2b04db39805ba67cb6f94d35f38b9714309b46d0021032913f Dec 05 16:18:13 crc kubenswrapper[4778]: I1205 16:18:13.277931 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 16:18:13 crc kubenswrapper[4778]: I1205 16:18:13.966111 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-gdlgn" event={"ID":"9028e391-6a57-4546-8f3c-8a0eecc5f053","Type":"ContainerStarted","Data":"18fc5d350d6ae1bbf416c9c3933499a56ae3a4cd5e84d67a95a739c1922899e6"} Dec 05 16:18:13 crc kubenswrapper[4778]: I1205 16:18:13.966439 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-gdlgn" event={"ID":"9028e391-6a57-4546-8f3c-8a0eecc5f053","Type":"ContainerStarted","Data":"f407d55c1baa8b9586417ccb29b6deb3005ce7a07155510f8111482659e21521"} Dec 05 16:18:13 crc kubenswrapper[4778]: I1205 16:18:13.969500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"156cc8f8-5c0b-46c9-ae14-3a3493473f75","Type":"ContainerStarted","Data":"45d9069d109a9d2b04db39805ba67cb6f94d35f38b9714309b46d0021032913f"} Dec 05 16:18:13 crc kubenswrapper[4778]: I1205 16:18:13.988107 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-gdlgn" podStartSLOduration=1.98808023 podStartE2EDuration="1.98808023s" podCreationTimestamp="2025-12-05 16:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:13.979732574 +0000 UTC m=+1381.083528954" watchObservedRunningTime="2025-12-05 16:18:13.98808023 +0000 UTC m=+1381.091876620" Dec 05 16:18:14 crc kubenswrapper[4778]: I1205 16:18:14.639394 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:18:16 crc kubenswrapper[4778]: I1205 16:18:16.997634 4778 generic.go:334] "Generic (PLEG): container finished" podID="9028e391-6a57-4546-8f3c-8a0eecc5f053" containerID="18fc5d350d6ae1bbf416c9c3933499a56ae3a4cd5e84d67a95a739c1922899e6" exitCode=0 Dec 05 16:18:16 crc kubenswrapper[4778]: I1205 16:18:16.998561 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-gdlgn" event={"ID":"9028e391-6a57-4546-8f3c-8a0eecc5f053","Type":"ContainerDied","Data":"18fc5d350d6ae1bbf416c9c3933499a56ae3a4cd5e84d67a95a739c1922899e6"} Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.317249 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-gdlgn" Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.456728 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-config-data\") pod \"9028e391-6a57-4546-8f3c-8a0eecc5f053\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.457103 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-fernet-keys\") pod \"9028e391-6a57-4546-8f3c-8a0eecc5f053\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.457132 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-scripts\") pod \"9028e391-6a57-4546-8f3c-8a0eecc5f053\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.457185 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-combined-ca-bundle\") pod \"9028e391-6a57-4546-8f3c-8a0eecc5f053\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.457235 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-credential-keys\") pod \"9028e391-6a57-4546-8f3c-8a0eecc5f053\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.457350 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgnwk\" (UniqueName: \"kubernetes.io/projected/9028e391-6a57-4546-8f3c-8a0eecc5f053-kube-api-access-lgnwk\") pod \"9028e391-6a57-4546-8f3c-8a0eecc5f053\" (UID: \"9028e391-6a57-4546-8f3c-8a0eecc5f053\") " Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.462406 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9028e391-6a57-4546-8f3c-8a0eecc5f053-kube-api-access-lgnwk" (OuterVolumeSpecName: "kube-api-access-lgnwk") pod "9028e391-6a57-4546-8f3c-8a0eecc5f053" (UID: "9028e391-6a57-4546-8f3c-8a0eecc5f053"). InnerVolumeSpecName "kube-api-access-lgnwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.462452 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-scripts" (OuterVolumeSpecName: "scripts") pod "9028e391-6a57-4546-8f3c-8a0eecc5f053" (UID: "9028e391-6a57-4546-8f3c-8a0eecc5f053"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.462422 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9028e391-6a57-4546-8f3c-8a0eecc5f053" (UID: "9028e391-6a57-4546-8f3c-8a0eecc5f053"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.462972 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9028e391-6a57-4546-8f3c-8a0eecc5f053" (UID: "9028e391-6a57-4546-8f3c-8a0eecc5f053"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.478499 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-config-data" (OuterVolumeSpecName: "config-data") pod "9028e391-6a57-4546-8f3c-8a0eecc5f053" (UID: "9028e391-6a57-4546-8f3c-8a0eecc5f053"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.482640 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9028e391-6a57-4546-8f3c-8a0eecc5f053" (UID: "9028e391-6a57-4546-8f3c-8a0eecc5f053"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.559824 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgnwk\" (UniqueName: \"kubernetes.io/projected/9028e391-6a57-4546-8f3c-8a0eecc5f053-kube-api-access-lgnwk\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.559853 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.559862 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.559871 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.559880 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:18 crc kubenswrapper[4778]: I1205 16:18:18.559891 4778 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9028e391-6a57-4546-8f3c-8a0eecc5f053-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.013621 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-gdlgn" event={"ID":"9028e391-6a57-4546-8f3c-8a0eecc5f053","Type":"ContainerDied","Data":"f407d55c1baa8b9586417ccb29b6deb3005ce7a07155510f8111482659e21521"} Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.013655 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f407d55c1baa8b9586417ccb29b6deb3005ce7a07155510f8111482659e21521" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.013991 4778 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-gdlgn" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.015198 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"156cc8f8-5c0b-46c9-ae14-3a3493473f75","Type":"ContainerStarted","Data":"7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d"} Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.112842 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-gdlgn"] Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.124245 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-gdlgn"] Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.210245 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-fsmts"] Dec 05 16:18:19 crc kubenswrapper[4778]: E1205 16:18:19.210660 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9028e391-6a57-4546-8f3c-8a0eecc5f053" containerName="keystone-bootstrap" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.210675 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9028e391-6a57-4546-8f3c-8a0eecc5f053" containerName="keystone-bootstrap" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.210879 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9028e391-6a57-4546-8f3c-8a0eecc5f053" containerName="keystone-bootstrap" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.211467 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.213848 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"osp-secret" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.213909 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.213996 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.214157 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.214832 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-s5hnz" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.233020 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-fsmts"] Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.266142 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9028e391-6a57-4546-8f3c-8a0eecc5f053" path="/var/lib/kubelet/pods/9028e391-6a57-4546-8f3c-8a0eecc5f053/volumes" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.371702 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-combined-ca-bundle\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.371843 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-scripts\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.371877 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-config-data\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.373049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-credential-keys\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.373082 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfdhn\" (UniqueName: \"kubernetes.io/projected/cd92a3b6-617a-4962-b18c-50ab46cbfe54-kube-api-access-lfdhn\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.373110 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-fernet-keys\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.474033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-fernet-keys\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.474375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-combined-ca-bundle\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.474497 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-scripts\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.474534 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-config-data\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: 
I1205 16:18:19.474583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-credential-keys\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.474613 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdhn\" (UniqueName: \"kubernetes.io/projected/cd92a3b6-617a-4962-b18c-50ab46cbfe54-kube-api-access-lfdhn\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.480596 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-combined-ca-bundle\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.480671 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-fernet-keys\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.481260 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-config-data\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.481403 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-scripts\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.482093 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-credential-keys\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.492434 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfdhn\" (UniqueName: \"kubernetes.io/projected/cd92a3b6-617a-4962-b18c-50ab46cbfe54-kube-api-access-lfdhn\") pod \"keystone-bootstrap-fsmts\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:19 crc kubenswrapper[4778]: I1205 16:18:19.533757 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:20 crc kubenswrapper[4778]: I1205 16:18:20.182343 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-fsmts"] Dec 05 16:18:22 crc kubenswrapper[4778]: I1205 16:18:22.039820 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-fsmts" event={"ID":"cd92a3b6-617a-4962-b18c-50ab46cbfe54","Type":"ContainerStarted","Data":"b94d7768f2620eb17a6e6d0bdf0494f7a48628df8ecd656e000cdd771d23e956"} Dec 05 16:18:22 crc kubenswrapper[4778]: I1205 16:18:22.040365 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-fsmts" event={"ID":"cd92a3b6-617a-4962-b18c-50ab46cbfe54","Type":"ContainerStarted","Data":"333b295ec01de84e895d11d80a296fc2abaf4d6450bddd4f708b6857662f3dd3"} Dec 05 16:18:22 crc kubenswrapper[4778]: I1205 16:18:22.042748 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"156cc8f8-5c0b-46c9-ae14-3a3493473f75","Type":"ContainerStarted","Data":"bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44"} Dec 05 16:18:22 crc kubenswrapper[4778]: I1205 16:18:22.066948 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-fsmts" podStartSLOduration=3.066928196 podStartE2EDuration="3.066928196s" podCreationTimestamp="2025-12-05 16:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:22.05956058 +0000 UTC m=+1389.163356960" watchObservedRunningTime="2025-12-05 16:18:22.066928196 +0000 UTC m=+1389.170724576" Dec 05 16:18:26 crc kubenswrapper[4778]: I1205 16:18:26.088552 4778 generic.go:334] "Generic (PLEG): container finished" podID="cd92a3b6-617a-4962-b18c-50ab46cbfe54" containerID="b94d7768f2620eb17a6e6d0bdf0494f7a48628df8ecd656e000cdd771d23e956" exitCode=0 Dec 05 16:18:26 crc kubenswrapper[4778]: I1205 16:18:26.088644 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-fsmts" event={"ID":"cd92a3b6-617a-4962-b18c-50ab46cbfe54","Type":"ContainerDied","Data":"b94d7768f2620eb17a6e6d0bdf0494f7a48628df8ecd656e000cdd771d23e956"} Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.341989 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.363476 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-scripts\") pod \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.363571 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-credential-keys\") pod \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.363597 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-config-data\") pod \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.363645 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfdhn\" (UniqueName: \"kubernetes.io/projected/cd92a3b6-617a-4962-b18c-50ab46cbfe54-kube-api-access-lfdhn\") pod \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.363674 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-combined-ca-bundle\") pod \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.363695 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-fernet-keys\") pod \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\" (UID: \"cd92a3b6-617a-4962-b18c-50ab46cbfe54\") " Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.371088 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cd92a3b6-617a-4962-b18c-50ab46cbfe54" (UID: "cd92a3b6-617a-4962-b18c-50ab46cbfe54"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.380583 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-scripts" (OuterVolumeSpecName: "scripts") pod "cd92a3b6-617a-4962-b18c-50ab46cbfe54" (UID: "cd92a3b6-617a-4962-b18c-50ab46cbfe54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.383622 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cd92a3b6-617a-4962-b18c-50ab46cbfe54" (UID: "cd92a3b6-617a-4962-b18c-50ab46cbfe54"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.387996 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd92a3b6-617a-4962-b18c-50ab46cbfe54-kube-api-access-lfdhn" (OuterVolumeSpecName: "kube-api-access-lfdhn") pod "cd92a3b6-617a-4962-b18c-50ab46cbfe54" (UID: "cd92a3b6-617a-4962-b18c-50ab46cbfe54"). InnerVolumeSpecName "kube-api-access-lfdhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.392709 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-config-data" (OuterVolumeSpecName: "config-data") pod "cd92a3b6-617a-4962-b18c-50ab46cbfe54" (UID: "cd92a3b6-617a-4962-b18c-50ab46cbfe54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.400224 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd92a3b6-617a-4962-b18c-50ab46cbfe54" (UID: "cd92a3b6-617a-4962-b18c-50ab46cbfe54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.465171 4778 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.465488 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.465504 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfdhn\" (UniqueName: \"kubernetes.io/projected/cd92a3b6-617a-4962-b18c-50ab46cbfe54-kube-api-access-lfdhn\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.465521 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.465536 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:29 crc kubenswrapper[4778]: I1205 16:18:29.465547 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd92a3b6-617a-4962-b18c-50ab46cbfe54-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.122239 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-fsmts" event={"ID":"cd92a3b6-617a-4962-b18c-50ab46cbfe54","Type":"ContainerDied","Data":"333b295ec01de84e895d11d80a296fc2abaf4d6450bddd4f708b6857662f3dd3"} Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.122317 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="333b295ec01de84e895d11d80a296fc2abaf4d6450bddd4f708b6857662f3dd3" Dec 05 16:18:30 crc 
kubenswrapper[4778]: I1205 16:18:30.122664 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-fsmts" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.124583 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"156cc8f8-5c0b-46c9-ae14-3a3493473f75","Type":"ContainerStarted","Data":"16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6"} Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.458973 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-7dc59c9b94-xvw4w"] Dec 05 16:18:30 crc kubenswrapper[4778]: E1205 16:18:30.459294 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd92a3b6-617a-4962-b18c-50ab46cbfe54" containerName="keystone-bootstrap" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.459307 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd92a3b6-617a-4962-b18c-50ab46cbfe54" containerName="keystone-bootstrap" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.459504 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd92a3b6-617a-4962-b18c-50ab46cbfe54" containerName="keystone-bootstrap" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.460141 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.463199 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.463380 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-s5hnz" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.463985 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.464525 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-keystone-public-svc" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.464761 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.464974 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-keystone-internal-svc" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.476889 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-7dc59c9b94-xvw4w"] Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.480352 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-fernet-keys\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.480427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-public-tls-certs\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc 
kubenswrapper[4778]: I1205 16:18:30.480476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-internal-tls-certs\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.480513 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28s25\" (UniqueName: \"kubernetes.io/projected/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-kube-api-access-28s25\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.480548 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-credential-keys\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.480571 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-scripts\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.480654 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-combined-ca-bundle\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.480698 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-config-data\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.581864 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-public-tls-certs\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.581933 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-internal-tls-certs\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.581968 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28s25\" (UniqueName: \"kubernetes.io/projected/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-kube-api-access-28s25\") pod 
\"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.581987 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-credential-keys\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.582006 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-scripts\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.582027 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-combined-ca-bundle\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.582065 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-config-data\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.582092 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-fernet-keys\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.587304 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-combined-ca-bundle\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.590955 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-internal-tls-certs\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.591137 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-credential-keys\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.591533 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-scripts\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: 
\"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.593227 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-fernet-keys\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.595825 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-config-data\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.595962 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-public-tls-certs\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.610997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28s25\" (UniqueName: \"kubernetes.io/projected/fb21dc96-1d4b-4116-95d2-659fc1daa3cd-kube-api-access-28s25\") pod \"keystone-7dc59c9b94-xvw4w\" (UID: \"fb21dc96-1d4b-4116-95d2-659fc1daa3cd\") " pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.774598 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.897854 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fz5jp"] Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.946245 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fz5jp"] Dec 05 16:18:30 crc kubenswrapper[4778]: I1205 16:18:30.946367 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:31 crc kubenswrapper[4778]: I1205 16:18:31.088902 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8k5q\" (UniqueName: \"kubernetes.io/projected/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-kube-api-access-x8k5q\") pod \"certified-operators-fz5jp\" (UID: \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\") " pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:31 crc kubenswrapper[4778]: I1205 16:18:31.089289 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-utilities\") pod \"certified-operators-fz5jp\" (UID: \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\") " pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:31 crc kubenswrapper[4778]: I1205 16:18:31.089321 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-catalog-content\") pod \"certified-operators-fz5jp\" (UID: \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\") " pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:31 crc kubenswrapper[4778]: I1205 16:18:31.192465 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-utilities\") pod \"certified-operators-fz5jp\" (UID: \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\") " pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:31 crc kubenswrapper[4778]: I1205 16:18:31.192525 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-catalog-content\") pod \"certified-operators-fz5jp\" (UID: \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\") " pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:31 crc kubenswrapper[4778]: I1205 16:18:31.192691 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8k5q\" (UniqueName: \"kubernetes.io/projected/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-kube-api-access-x8k5q\") pod \"certified-operators-fz5jp\" (UID: \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\") " pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:31 crc kubenswrapper[4778]: I1205 16:18:31.193279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-catalog-content\") pod \"certified-operators-fz5jp\" (UID: \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\") " pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:31 crc kubenswrapper[4778]: I1205 16:18:31.193299 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-utilities\") pod \"certified-operators-fz5jp\" (UID: \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\") " pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:31 crc kubenswrapper[4778]: I1205 16:18:31.223120 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8k5q\" (UniqueName: \"kubernetes.io/projected/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-kube-api-access-x8k5q\") pod 
\"certified-operators-fz5jp\" (UID: \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\") " pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:31 crc kubenswrapper[4778]: I1205 16:18:31.283121 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:31 crc kubenswrapper[4778]: I1205 16:18:31.366379 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-7dc59c9b94-xvw4w"] Dec 05 16:18:31 crc kubenswrapper[4778]: W1205 16:18:31.376258 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb21dc96_1d4b_4116_95d2_659fc1daa3cd.slice/crio-5710d985dd31240da2283e0f5bdf0ee876695a76a6b4861add9d45846259f998 WatchSource:0}: Error finding container 5710d985dd31240da2283e0f5bdf0ee876695a76a6b4861add9d45846259f998: Status 404 returned error can't find the container with id 5710d985dd31240da2283e0f5bdf0ee876695a76a6b4861add9d45846259f998 Dec 05 16:18:31 crc kubenswrapper[4778]: I1205 16:18:31.860132 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fz5jp"] Dec 05 16:18:32 crc kubenswrapper[4778]: I1205 16:18:32.147431 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fz5jp" event={"ID":"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae","Type":"ContainerStarted","Data":"596489c4da7a3a0be16bfeed00bd567ddff57dc153e3203be2424be6e141e12f"} Dec 05 16:18:32 crc kubenswrapper[4778]: I1205 16:18:32.148978 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" event={"ID":"fb21dc96-1d4b-4116-95d2-659fc1daa3cd","Type":"ContainerStarted","Data":"f4e671147dcd92c69a02ef7e5a3100d9c3e8998fd9bece7a25acc599b4f934ea"} Dec 05 16:18:32 crc kubenswrapper[4778]: I1205 16:18:32.149022 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" event={"ID":"fb21dc96-1d4b-4116-95d2-659fc1daa3cd","Type":"ContainerStarted","Data":"5710d985dd31240da2283e0f5bdf0ee876695a76a6b4861add9d45846259f998"} Dec 05 16:18:32 crc kubenswrapper[4778]: I1205 16:18:32.149075 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:18:32 crc kubenswrapper[4778]: I1205 16:18:32.178332 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" podStartSLOduration=2.178313291 podStartE2EDuration="2.178313291s" podCreationTimestamp="2025-12-05 16:18:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:32.172181323 +0000 UTC m=+1399.275977723" watchObservedRunningTime="2025-12-05 16:18:32.178313291 +0000 UTC m=+1399.282109671" Dec 05 16:18:34 crc kubenswrapper[4778]: I1205 16:18:34.170285 4778 generic.go:334] "Generic (PLEG): container finished" podID="8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" containerID="a46b55b856928913d01a683c7038813c2c9f93901df5011d78a4e39a3a04b320" exitCode=0 Dec 05 16:18:34 crc kubenswrapper[4778]: I1205 16:18:34.170396 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fz5jp" event={"ID":"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae","Type":"ContainerDied","Data":"a46b55b856928913d01a683c7038813c2c9f93901df5011d78a4e39a3a04b320"} Dec 05 16:18:41 crc 
kubenswrapper[4778]: I1205 16:18:41.225152 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"156cc8f8-5c0b-46c9-ae14-3a3493473f75","Type":"ContainerStarted","Data":"2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec"} Dec 05 16:18:41 crc kubenswrapper[4778]: I1205 16:18:41.225257 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="ceilometer-central-agent" containerID="cri-o://7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d" gracePeriod=30 Dec 05 16:18:41 crc kubenswrapper[4778]: I1205 16:18:41.225292 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="sg-core" containerID="cri-o://16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6" gracePeriod=30 Dec 05 16:18:41 crc kubenswrapper[4778]: I1205 16:18:41.225329 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="ceilometer-notification-agent" containerID="cri-o://bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44" gracePeriod=30 Dec 05 16:18:41 crc kubenswrapper[4778]: I1205 16:18:41.225340 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="proxy-httpd" containerID="cri-o://2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec" gracePeriod=30 Dec 05 16:18:41 crc kubenswrapper[4778]: I1205 16:18:41.228854 4778 generic.go:334] "Generic (PLEG): container finished" podID="8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" containerID="bb13550fbe7970923eeaddbc0b74c92c94bf3ca6b23a3612069330c9ef2fd0ac" exitCode=0 Dec 05 16:18:41 crc kubenswrapper[4778]: I1205 16:18:41.230088 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:41 crc kubenswrapper[4778]: I1205 16:18:41.230236 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fz5jp" event={"ID":"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae","Type":"ContainerDied","Data":"bb13550fbe7970923eeaddbc0b74c92c94bf3ca6b23a3612069330c9ef2fd0ac"} Dec 05 16:18:41 crc kubenswrapper[4778]: I1205 16:18:41.254770 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.164184316 podStartE2EDuration="29.254749734s" podCreationTimestamp="2025-12-05 16:18:12 +0000 UTC" firstStartedPulling="2025-12-05 16:18:13.277639061 +0000 UTC m=+1380.381435441" lastFinishedPulling="2025-12-05 16:18:40.368204479 +0000 UTC m=+1407.472000859" observedRunningTime="2025-12-05 16:18:41.248680348 +0000 UTC m=+1408.352476738" watchObservedRunningTime="2025-12-05 16:18:41.254749734 +0000 UTC m=+1408.358546124" Dec 05 16:18:42 crc kubenswrapper[4778]: I1205 16:18:42.240319 4778 generic.go:334] "Generic (PLEG): container finished" podID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerID="2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec" exitCode=0 Dec 05 16:18:42 crc kubenswrapper[4778]: I1205 16:18:42.240641 4778 generic.go:334] "Generic (PLEG): container finished" podID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" 
containerID="16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6" exitCode=2 Dec 05 16:18:42 crc kubenswrapper[4778]: I1205 16:18:42.240652 4778 generic.go:334] "Generic (PLEG): container finished" podID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerID="7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d" exitCode=0 Dec 05 16:18:42 crc kubenswrapper[4778]: I1205 16:18:42.240351 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"156cc8f8-5c0b-46c9-ae14-3a3493473f75","Type":"ContainerDied","Data":"2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec"} Dec 05 16:18:42 crc kubenswrapper[4778]: I1205 16:18:42.240717 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"156cc8f8-5c0b-46c9-ae14-3a3493473f75","Type":"ContainerDied","Data":"16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6"} Dec 05 16:18:42 crc kubenswrapper[4778]: I1205 16:18:42.240733 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"156cc8f8-5c0b-46c9-ae14-3a3493473f75","Type":"ContainerDied","Data":"7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d"} Dec 05 16:18:42 crc kubenswrapper[4778]: I1205 16:18:42.243040 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fz5jp" event={"ID":"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae","Type":"ContainerStarted","Data":"41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a"} Dec 05 16:18:42 crc kubenswrapper[4778]: I1205 16:18:42.263656 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fz5jp" podStartSLOduration=6.867330097 podStartE2EDuration="12.263637387s" podCreationTimestamp="2025-12-05 16:18:30 +0000 UTC" firstStartedPulling="2025-12-05 16:18:36.574947849 +0000 UTC m=+1403.678744229" lastFinishedPulling="2025-12-05 16:18:41.971255149 +0000 UTC m=+1409.075051519" observedRunningTime="2025-12-05 16:18:42.259848391 +0000 UTC m=+1409.363644801" watchObservedRunningTime="2025-12-05 16:18:42.263637387 +0000 UTC m=+1409.367433777" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.755511 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.857247 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-sg-core-conf-yaml\") pod \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.857331 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvll9\" (UniqueName: \"kubernetes.io/projected/156cc8f8-5c0b-46c9-ae14-3a3493473f75-kube-api-access-gvll9\") pod \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.857412 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-config-data\") pod \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.857462 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/156cc8f8-5c0b-46c9-ae14-3a3493473f75-log-httpd\") pod \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.857552 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-scripts\") pod \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.858110 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/156cc8f8-5c0b-46c9-ae14-3a3493473f75-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "156cc8f8-5c0b-46c9-ae14-3a3493473f75" (UID: "156cc8f8-5c0b-46c9-ae14-3a3493473f75"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.858212 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-combined-ca-bundle\") pod \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.858668 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/156cc8f8-5c0b-46c9-ae14-3a3493473f75-run-httpd\") pod \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\" (UID: \"156cc8f8-5c0b-46c9-ae14-3a3493473f75\") " Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.858926 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/156cc8f8-5c0b-46c9-ae14-3a3493473f75-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "156cc8f8-5c0b-46c9-ae14-3a3493473f75" (UID: "156cc8f8-5c0b-46c9-ae14-3a3493473f75"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.859229 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/156cc8f8-5c0b-46c9-ae14-3a3493473f75-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.859245 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/156cc8f8-5c0b-46c9-ae14-3a3493473f75-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.863045 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-scripts" (OuterVolumeSpecName: "scripts") pod "156cc8f8-5c0b-46c9-ae14-3a3493473f75" (UID: "156cc8f8-5c0b-46c9-ae14-3a3493473f75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.868435 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156cc8f8-5c0b-46c9-ae14-3a3493473f75-kube-api-access-gvll9" (OuterVolumeSpecName: "kube-api-access-gvll9") pod "156cc8f8-5c0b-46c9-ae14-3a3493473f75" (UID: "156cc8f8-5c0b-46c9-ae14-3a3493473f75"). InnerVolumeSpecName "kube-api-access-gvll9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.891481 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "156cc8f8-5c0b-46c9-ae14-3a3493473f75" (UID: "156cc8f8-5c0b-46c9-ae14-3a3493473f75"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.950231 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-config-data" (OuterVolumeSpecName: "config-data") pod "156cc8f8-5c0b-46c9-ae14-3a3493473f75" (UID: "156cc8f8-5c0b-46c9-ae14-3a3493473f75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.956032 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "156cc8f8-5c0b-46c9-ae14-3a3493473f75" (UID: "156cc8f8-5c0b-46c9-ae14-3a3493473f75"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.960693 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.960745 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvll9\" (UniqueName: \"kubernetes.io/projected/156cc8f8-5c0b-46c9-ae14-3a3493473f75-kube-api-access-gvll9\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.960758 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.960767 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:43 crc kubenswrapper[4778]: I1205 16:18:43.960775 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156cc8f8-5c0b-46c9-ae14-3a3493473f75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.261263 4778 generic.go:334] "Generic (PLEG): container finished" podID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerID="bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44" exitCode=0 Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.261313 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"156cc8f8-5c0b-46c9-ae14-3a3493473f75","Type":"ContainerDied","Data":"bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44"} Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.261349 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"156cc8f8-5c0b-46c9-ae14-3a3493473f75","Type":"ContainerDied","Data":"45d9069d109a9d2b04db39805ba67cb6f94d35f38b9714309b46d0021032913f"} Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.261391 4778 scope.go:117] "RemoveContainer" containerID="2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.261390 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.276869 4778 scope.go:117] "RemoveContainer" containerID="16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.291319 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.296642 4778 scope.go:117] "RemoveContainer" containerID="bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.298086 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.325560 4778 scope.go:117] "RemoveContainer" containerID="7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.328329 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:18:44 crc kubenswrapper[4778]: E1205 16:18:44.328813 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="ceilometer-central-agent" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.328847 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="ceilometer-central-agent" Dec 05 16:18:44 crc kubenswrapper[4778]: E1205 16:18:44.328876 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="ceilometer-notification-agent" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.328888 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="ceilometer-notification-agent" Dec 05 16:18:44 crc kubenswrapper[4778]: E1205 16:18:44.328919 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="sg-core" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.328931 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="sg-core" Dec 05 16:18:44 crc kubenswrapper[4778]: E1205 16:18:44.328965 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="proxy-httpd" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.328977 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="proxy-httpd" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.329253 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="ceilometer-central-agent" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.329278 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="sg-core" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.329305 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="proxy-httpd" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.329328 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" containerName="ceilometer-notification-agent" Dec 05 16:18:44 crc kubenswrapper[4778]: 
I1205 16:18:44.331721 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.334274 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.335081 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.346992 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.357777 4778 scope.go:117] "RemoveContainer" containerID="2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec" Dec 05 16:18:44 crc kubenswrapper[4778]: E1205 16:18:44.358212 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec\": container with ID starting with 2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec not found: ID does not exist" containerID="2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.358247 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec"} err="failed to get container status \"2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec\": rpc error: code = NotFound desc = could not find container \"2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec\": container with ID starting with 2a39c8716b77302a8d0ba961ac626aba2bb0834fb1711e32d50a00daeb8098ec not found: ID does not exist" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.358274 4778 scope.go:117] "RemoveContainer" containerID="16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6" Dec 05 16:18:44 crc kubenswrapper[4778]: E1205 16:18:44.358641 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6\": container with ID starting with 16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6 not found: ID does not exist" containerID="16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.358701 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6"} err="failed to get container status \"16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6\": rpc error: code = NotFound desc = could not find container \"16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6\": container with ID starting with 16b1a415c7a143e805de276fe0d6b639fe17fea6fbd609f79f463bff7237a7d6 not found: ID does not exist" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.358733 4778 scope.go:117] "RemoveContainer" containerID="bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44" Dec 05 16:18:44 crc kubenswrapper[4778]: E1205 16:18:44.359942 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44\": container with ID starting with bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44 not found: ID does not exist" containerID="bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.360002 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44"} err="failed to get container status \"bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44\": rpc error: code = NotFound desc = could not find container \"bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44\": container with ID starting with bd8147fe7e41d58ac404cecb215b63b371f917bacb4db439ed5d0d03ec18ce44 not found: ID does not exist" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.360040 4778 scope.go:117] "RemoveContainer" containerID="7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d" Dec 05 16:18:44 crc kubenswrapper[4778]: E1205 16:18:44.360524 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d\": container with ID starting with 7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d not found: ID does not exist" containerID="7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.360577 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d"} err="failed to get container status \"7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d\": rpc error: code = NotFound desc = could not find container \"7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d\": container with ID starting with 7c1f0e8c4539d16773a242bec8c023ff75983e865e47c2b30f4487ec366b4d1d not found: ID does not exist" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.366484 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbb3480-c839-40fc-8c46-02d607401e7b-run-httpd\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.366574 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-scripts\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.366708 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.366818 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blc57\" (UniqueName: \"kubernetes.io/projected/7bbb3480-c839-40fc-8c46-02d607401e7b-kube-api-access-blc57\") pod 
\"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.366858 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbb3480-c839-40fc-8c46-02d607401e7b-log-httpd\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.366899 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-config-data\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.367071 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.469097 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.469174 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbb3480-c839-40fc-8c46-02d607401e7b-run-httpd\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.469195 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-scripts\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.469227 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.469265 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blc57\" (UniqueName: \"kubernetes.io/projected/7bbb3480-c839-40fc-8c46-02d607401e7b-kube-api-access-blc57\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.469285 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbb3480-c839-40fc-8c46-02d607401e7b-log-httpd\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.469301 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-config-data\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.470245 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbb3480-c839-40fc-8c46-02d607401e7b-run-httpd\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.470460 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbb3480-c839-40fc-8c46-02d607401e7b-log-httpd\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.473511 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.474342 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-config-data\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.474608 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-scripts\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.476008 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.491921 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blc57\" (UniqueName: \"kubernetes.io/projected/7bbb3480-c839-40fc-8c46-02d607401e7b-kube-api-access-blc57\") pod \"ceilometer-0\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:44 crc kubenswrapper[4778]: I1205 16:18:44.657965 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:45 crc kubenswrapper[4778]: I1205 16:18:45.090503 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:18:45 crc kubenswrapper[4778]: W1205 16:18:45.093640 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bbb3480_c839_40fc_8c46_02d607401e7b.slice/crio-cd21dfeaee2fe0ca30e61569126e2c07af7b4f97a0431b927ac6c8d7bbf3a1fe WatchSource:0}: Error finding container cd21dfeaee2fe0ca30e61569126e2c07af7b4f97a0431b927ac6c8d7bbf3a1fe: Status 404 returned error can't find the container with id cd21dfeaee2fe0ca30e61569126e2c07af7b4f97a0431b927ac6c8d7bbf3a1fe Dec 05 16:18:45 crc kubenswrapper[4778]: I1205 16:18:45.261585 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="156cc8f8-5c0b-46c9-ae14-3a3493473f75" path="/var/lib/kubelet/pods/156cc8f8-5c0b-46c9-ae14-3a3493473f75/volumes" Dec 05 16:18:45 crc kubenswrapper[4778]: I1205 16:18:45.274339 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7bbb3480-c839-40fc-8c46-02d607401e7b","Type":"ContainerStarted","Data":"cd21dfeaee2fe0ca30e61569126e2c07af7b4f97a0431b927ac6c8d7bbf3a1fe"} Dec 05 16:18:47 crc kubenswrapper[4778]: I1205 16:18:47.294587 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7bbb3480-c839-40fc-8c46-02d607401e7b","Type":"ContainerStarted","Data":"c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39"} Dec 05 16:18:48 crc kubenswrapper[4778]: I1205 16:18:48.310877 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7bbb3480-c839-40fc-8c46-02d607401e7b","Type":"ContainerStarted","Data":"6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2"} Dec 05 16:18:48 crc kubenswrapper[4778]: I1205 16:18:48.891184 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5xmv2"] Dec 05 16:18:48 crc kubenswrapper[4778]: I1205 16:18:48.893645 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:18:48 crc kubenswrapper[4778]: I1205 16:18:48.911947 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5xmv2"] Dec 05 16:18:49 crc kubenswrapper[4778]: I1205 16:18:49.095930 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-utilities\") pod \"redhat-operators-5xmv2\" (UID: \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\") " pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:18:49 crc kubenswrapper[4778]: I1205 16:18:49.096031 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2vpl\" (UniqueName: \"kubernetes.io/projected/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-kube-api-access-s2vpl\") pod \"redhat-operators-5xmv2\" (UID: \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\") " pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:18:49 crc kubenswrapper[4778]: I1205 16:18:49.096081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-catalog-content\") pod \"redhat-operators-5xmv2\" (UID: \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\") " pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:18:49 crc kubenswrapper[4778]: I1205 16:18:49.197785 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2vpl\" (UniqueName: \"kubernetes.io/projected/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-kube-api-access-s2vpl\") pod \"redhat-operators-5xmv2\" (UID: \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\") " pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:18:49 crc kubenswrapper[4778]: I1205 16:18:49.197841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-catalog-content\") pod \"redhat-operators-5xmv2\" (UID: \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\") " pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:18:49 crc kubenswrapper[4778]: I1205 16:18:49.197934 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-utilities\") pod \"redhat-operators-5xmv2\" (UID: \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\") " pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:18:49 crc kubenswrapper[4778]: I1205 16:18:49.198489 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-utilities\") pod \"redhat-operators-5xmv2\" (UID: \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\") " pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:18:49 crc kubenswrapper[4778]: I1205 16:18:49.198516 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-catalog-content\") pod \"redhat-operators-5xmv2\" (UID: \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\") " pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:18:49 crc kubenswrapper[4778]: I1205 16:18:49.224250 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s2vpl\" (UniqueName: \"kubernetes.io/projected/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-kube-api-access-s2vpl\") pod \"redhat-operators-5xmv2\" (UID: \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\") " pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:18:49 crc kubenswrapper[4778]: I1205 16:18:49.327459 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7bbb3480-c839-40fc-8c46-02d607401e7b","Type":"ContainerStarted","Data":"555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c"} Dec 05 16:18:49 crc kubenswrapper[4778]: I1205 16:18:49.513615 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:18:49 crc kubenswrapper[4778]: I1205 16:18:49.971280 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5xmv2"] Dec 05 16:18:49 crc kubenswrapper[4778]: W1205 16:18:49.975303 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00066a67_563b_4b32_a1fb_fdf5f4de8f6c.slice/crio-33a26cf3fec156bb162297e4cc4f59240143dc3c814dc9bd3fb1343406394553 WatchSource:0}: Error finding container 33a26cf3fec156bb162297e4cc4f59240143dc3c814dc9bd3fb1343406394553: Status 404 returned error can't find the container with id 33a26cf3fec156bb162297e4cc4f59240143dc3c814dc9bd3fb1343406394553 Dec 05 16:18:50 crc kubenswrapper[4778]: I1205 16:18:50.336938 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xmv2" event={"ID":"00066a67-563b-4b32-a1fb-fdf5f4de8f6c","Type":"ContainerStarted","Data":"33a26cf3fec156bb162297e4cc4f59240143dc3c814dc9bd3fb1343406394553"} Dec 05 16:18:51 crc kubenswrapper[4778]: I1205 16:18:51.283893 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:51 crc kubenswrapper[4778]: I1205 16:18:51.284178 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:51 crc kubenswrapper[4778]: I1205 16:18:51.335148 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:51 crc kubenswrapper[4778]: I1205 16:18:51.350593 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7bbb3480-c839-40fc-8c46-02d607401e7b","Type":"ContainerStarted","Data":"e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad"} Dec 05 16:18:51 crc kubenswrapper[4778]: I1205 16:18:51.350773 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:18:51 crc kubenswrapper[4778]: I1205 16:18:51.352772 4778 generic.go:334] "Generic (PLEG): container finished" podID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" containerID="6debf307eb26529bb19a42baddfbaded594b95ee512e2a5a99a7458a4e8fe1fc" exitCode=0 Dec 05 16:18:51 crc kubenswrapper[4778]: I1205 16:18:51.352883 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xmv2" event={"ID":"00066a67-563b-4b32-a1fb-fdf5f4de8f6c","Type":"ContainerDied","Data":"6debf307eb26529bb19a42baddfbaded594b95ee512e2a5a99a7458a4e8fe1fc"} Dec 05 16:18:51 crc kubenswrapper[4778]: I1205 16:18:51.398097 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.640750578 podStartE2EDuration="7.398075278s" podCreationTimestamp="2025-12-05 16:18:44 +0000 UTC" firstStartedPulling="2025-12-05 16:18:45.095818719 +0000 UTC m=+1412.199615099" lastFinishedPulling="2025-12-05 16:18:49.853143419 +0000 UTC m=+1416.956939799" observedRunningTime="2025-12-05 16:18:51.375345461 +0000 UTC m=+1418.479141851" watchObservedRunningTime="2025-12-05 16:18:51.398075278 +0000 UTC m=+1418.501871648" Dec 05 16:18:51 crc kubenswrapper[4778]: I1205 16:18:51.419436 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:52 crc kubenswrapper[4778]: I1205 16:18:52.361890 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xmv2" event={"ID":"00066a67-563b-4b32-a1fb-fdf5f4de8f6c","Type":"ContainerStarted","Data":"eeb240335d0b51ce5a0dbbb214955417f246da762581e95408fd43f8e2073381"} Dec 05 16:18:54 crc kubenswrapper[4778]: I1205 16:18:54.377455 4778 generic.go:334] "Generic (PLEG): container finished" podID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" containerID="eeb240335d0b51ce5a0dbbb214955417f246da762581e95408fd43f8e2073381" exitCode=0 Dec 05 16:18:54 crc kubenswrapper[4778]: I1205 16:18:54.377625 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xmv2" event={"ID":"00066a67-563b-4b32-a1fb-fdf5f4de8f6c","Type":"ContainerDied","Data":"eeb240335d0b51ce5a0dbbb214955417f246da762581e95408fd43f8e2073381"} Dec 05 16:18:54 crc kubenswrapper[4778]: I1205 16:18:54.684530 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fz5jp"] Dec 05 16:18:54 crc kubenswrapper[4778]: I1205 16:18:54.684721 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fz5jp" podUID="8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" containerName="registry-server" containerID="cri-o://41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a" gracePeriod=2 Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.131917 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.302231 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8k5q\" (UniqueName: \"kubernetes.io/projected/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-kube-api-access-x8k5q\") pod \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\" (UID: \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\") " Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.303126 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-utilities\") pod \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\" (UID: \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\") " Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.303225 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-catalog-content\") pod \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\" (UID: \"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae\") " Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.304408 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-utilities" (OuterVolumeSpecName: "utilities") pod "8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" (UID: "8c9242f7-8012-4b87-9e16-fb9a95cdd9ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.316672 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-kube-api-access-x8k5q" (OuterVolumeSpecName: "kube-api-access-x8k5q") pod "8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" (UID: "8c9242f7-8012-4b87-9e16-fb9a95cdd9ae"). InnerVolumeSpecName "kube-api-access-x8k5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.354926 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" (UID: "8c9242f7-8012-4b87-9e16-fb9a95cdd9ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.403123 4778 generic.go:334] "Generic (PLEG): container finished" podID="8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" containerID="41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a" exitCode=0 Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.403201 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fz5jp" event={"ID":"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae","Type":"ContainerDied","Data":"41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a"} Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.403230 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fz5jp" event={"ID":"8c9242f7-8012-4b87-9e16-fb9a95cdd9ae","Type":"ContainerDied","Data":"596489c4da7a3a0be16bfeed00bd567ddff57dc153e3203be2424be6e141e12f"} Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.403246 4778 scope.go:117] "RemoveContainer" containerID="41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.403392 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fz5jp" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.405980 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.406015 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.406031 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8k5q\" (UniqueName: \"kubernetes.io/projected/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae-kube-api-access-x8k5q\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.408121 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xmv2" event={"ID":"00066a67-563b-4b32-a1fb-fdf5f4de8f6c","Type":"ContainerStarted","Data":"dc17d22259eb2080b5f7e23de2ae00cd8104e849840c32af318ff4b3c74c6b6d"} Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.427891 4778 scope.go:117] "RemoveContainer" containerID="bb13550fbe7970923eeaddbc0b74c92c94bf3ca6b23a3612069330c9ef2fd0ac" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.440971 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5xmv2" podStartSLOduration=4.04654468 podStartE2EDuration="7.440954928s" podCreationTimestamp="2025-12-05 16:18:48 +0000 UTC" firstStartedPulling="2025-12-05 16:18:51.354374908 +0000 UTC m=+1418.458171288" lastFinishedPulling="2025-12-05 16:18:54.748785166 +0000 UTC m=+1421.852581536" observedRunningTime="2025-12-05 16:18:55.434550338 +0000 UTC m=+1422.538346728" watchObservedRunningTime="2025-12-05 16:18:55.440954928 +0000 UTC m=+1422.544751308" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.465667 4778 scope.go:117] "RemoveContainer" containerID="a46b55b856928913d01a683c7038813c2c9f93901df5011d78a4e39a3a04b320" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.470931 4778 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-fz5jp"] Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.480050 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fz5jp"] Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.487819 4778 scope.go:117] "RemoveContainer" containerID="41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a" Dec 05 16:18:55 crc kubenswrapper[4778]: E1205 16:18:55.489906 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a\": container with ID starting with 41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a not found: ID does not exist" containerID="41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.489941 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a"} err="failed to get container status \"41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a\": rpc error: code = NotFound desc = could not find container \"41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a\": container with ID starting with 41e0720293be2307c4a96e1a675d0ba62e87b4080e5e105417fa5f3d91ea783a not found: ID does not exist" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.489964 4778 scope.go:117] "RemoveContainer" containerID="bb13550fbe7970923eeaddbc0b74c92c94bf3ca6b23a3612069330c9ef2fd0ac" Dec 05 16:18:55 crc kubenswrapper[4778]: E1205 16:18:55.490294 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb13550fbe7970923eeaddbc0b74c92c94bf3ca6b23a3612069330c9ef2fd0ac\": container with ID starting with bb13550fbe7970923eeaddbc0b74c92c94bf3ca6b23a3612069330c9ef2fd0ac not found: ID does not exist" containerID="bb13550fbe7970923eeaddbc0b74c92c94bf3ca6b23a3612069330c9ef2fd0ac" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.490320 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb13550fbe7970923eeaddbc0b74c92c94bf3ca6b23a3612069330c9ef2fd0ac"} err="failed to get container status \"bb13550fbe7970923eeaddbc0b74c92c94bf3ca6b23a3612069330c9ef2fd0ac\": rpc error: code = NotFound desc = could not find container \"bb13550fbe7970923eeaddbc0b74c92c94bf3ca6b23a3612069330c9ef2fd0ac\": container with ID starting with bb13550fbe7970923eeaddbc0b74c92c94bf3ca6b23a3612069330c9ef2fd0ac not found: ID does not exist" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.490337 4778 scope.go:117] "RemoveContainer" containerID="a46b55b856928913d01a683c7038813c2c9f93901df5011d78a4e39a3a04b320" Dec 05 16:18:55 crc kubenswrapper[4778]: E1205 16:18:55.490742 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a46b55b856928913d01a683c7038813c2c9f93901df5011d78a4e39a3a04b320\": container with ID starting with a46b55b856928913d01a683c7038813c2c9f93901df5011d78a4e39a3a04b320 not found: ID does not exist" containerID="a46b55b856928913d01a683c7038813c2c9f93901df5011d78a4e39a3a04b320" Dec 05 16:18:55 crc kubenswrapper[4778]: I1205 16:18:55.490870 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a46b55b856928913d01a683c7038813c2c9f93901df5011d78a4e39a3a04b320"} err="failed to get container status \"a46b55b856928913d01a683c7038813c2c9f93901df5011d78a4e39a3a04b320\": rpc error: code = NotFound desc = could not find container \"a46b55b856928913d01a683c7038813c2c9f93901df5011d78a4e39a3a04b320\": container with ID starting with a46b55b856928913d01a683c7038813c2c9f93901df5011d78a4e39a3a04b320 not found: ID does not exist" Dec 05 16:18:57 crc kubenswrapper[4778]: I1205 16:18:57.260356 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" path="/var/lib/kubelet/pods/8c9242f7-8012-4b87-9e16-fb9a95cdd9ae/volumes" Dec 05 16:18:59 crc kubenswrapper[4778]: I1205 16:18:59.514554 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:18:59 crc kubenswrapper[4778]: I1205 16:18:59.515489 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:19:00 crc kubenswrapper[4778]: I1205 16:19:00.569407 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5xmv2" podUID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" containerName="registry-server" probeResult="failure" output=< Dec 05 16:19:00 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Dec 05 16:19:00 crc kubenswrapper[4778]: > Dec 05 16:19:02 crc kubenswrapper[4778]: I1205 16:19:02.575924 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/keystone-7dc59c9b94-xvw4w" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.846029 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstackclient"] Dec 05 16:19:06 crc kubenswrapper[4778]: E1205 16:19:06.847067 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" containerName="extract-utilities" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.847084 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" containerName="extract-utilities" Dec 05 16:19:06 crc kubenswrapper[4778]: E1205 16:19:06.847098 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" containerName="extract-content" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.847106 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" containerName="extract-content" Dec 05 16:19:06 crc kubenswrapper[4778]: E1205 16:19:06.847119 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" containerName="registry-server" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.847127 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" containerName="registry-server" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.847314 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c9242f7-8012-4b87-9e16-fb9a95cdd9ae" containerName="registry-server" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.848012 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.850315 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstack-config-secret" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.850577 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstackclient-openstackclient-dockercfg-tx7d4" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.850773 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-config" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.859058 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.997867 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8\") " pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.997912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkxcw\" (UniqueName: \"kubernetes.io/projected/ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8-kube-api-access-hkxcw\") pod \"openstackclient\" (UID: \"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8\") " pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.997935 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8-openstack-config-secret\") pod \"openstackclient\" (UID: \"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8\") " pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:06 crc kubenswrapper[4778]: I1205 16:19:06.997981 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8-openstack-config\") pod \"openstackclient\" (UID: \"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8\") " pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:07 crc kubenswrapper[4778]: I1205 16:19:07.099619 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8\") " pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:07 crc kubenswrapper[4778]: I1205 16:19:07.099679 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkxcw\" (UniqueName: \"kubernetes.io/projected/ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8-kube-api-access-hkxcw\") pod \"openstackclient\" (UID: \"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8\") " pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:07 crc kubenswrapper[4778]: I1205 16:19:07.099703 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8-openstack-config-secret\") pod \"openstackclient\" (UID: \"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8\") " 
pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:07 crc kubenswrapper[4778]: I1205 16:19:07.099734 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8-openstack-config\") pod \"openstackclient\" (UID: \"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8\") " pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:07 crc kubenswrapper[4778]: I1205 16:19:07.100633 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8-openstack-config\") pod \"openstackclient\" (UID: \"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8\") " pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:07 crc kubenswrapper[4778]: I1205 16:19:07.105073 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8-openstack-config-secret\") pod \"openstackclient\" (UID: \"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8\") " pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:07 crc kubenswrapper[4778]: I1205 16:19:07.106489 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8\") " pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:07 crc kubenswrapper[4778]: I1205 16:19:07.129733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkxcw\" (UniqueName: \"kubernetes.io/projected/ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8-kube-api-access-hkxcw\") pod \"openstackclient\" (UID: \"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8\") " pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:07 crc kubenswrapper[4778]: I1205 16:19:07.166763 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Dec 05 16:19:07 crc kubenswrapper[4778]: I1205 16:19:07.674283 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Dec 05 16:19:08 crc kubenswrapper[4778]: I1205 16:19:08.510388 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstackclient" event={"ID":"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8","Type":"ContainerStarted","Data":"31db53002b45d804e41bd9a49aa0013fee542407624624d84760ce154d6f80c0"} Dec 05 16:19:09 crc kubenswrapper[4778]: I1205 16:19:09.575695 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:19:09 crc kubenswrapper[4778]: I1205 16:19:09.627730 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:19:13 crc kubenswrapper[4778]: I1205 16:19:13.085398 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5xmv2"] Dec 05 16:19:13 crc kubenswrapper[4778]: I1205 16:19:13.085903 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5xmv2" podUID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" containerName="registry-server" containerID="cri-o://dc17d22259eb2080b5f7e23de2ae00cd8104e849840c32af318ff4b3c74c6b6d" gracePeriod=2 Dec 05 16:19:13 crc kubenswrapper[4778]: I1205 16:19:13.563495 4778 generic.go:334] "Generic (PLEG): container finished" podID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" containerID="dc17d22259eb2080b5f7e23de2ae00cd8104e849840c32af318ff4b3c74c6b6d" exitCode=0 Dec 05 16:19:13 crc kubenswrapper[4778]: I1205 16:19:13.563871 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xmv2" event={"ID":"00066a67-563b-4b32-a1fb-fdf5f4de8f6c","Type":"ContainerDied","Data":"dc17d22259eb2080b5f7e23de2ae00cd8104e849840c32af318ff4b3c74c6b6d"} Dec 05 16:19:14 crc kubenswrapper[4778]: I1205 16:19:14.662086 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.412894 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.413561 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="01b4e1d6-1a7d-4113-8954-278cfe2d60c3" containerName="kube-state-metrics" containerID="cri-o://9f9cc9f620d6ce4a31aae4d3bb7a095d64b1b1333ce9f316acde898ebd684071" gracePeriod=30 Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.611204 4778 generic.go:334] "Generic (PLEG): container finished" podID="01b4e1d6-1a7d-4113-8954-278cfe2d60c3" containerID="9f9cc9f620d6ce4a31aae4d3bb7a095d64b1b1333ce9f316acde898ebd684071" exitCode=2 Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.611291 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"01b4e1d6-1a7d-4113-8954-278cfe2d60c3","Type":"ContainerDied","Data":"9f9cc9f620d6ce4a31aae4d3bb7a095d64b1b1333ce9f316acde898ebd684071"} Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.665992 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.783742 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.804180 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2vpl\" (UniqueName: \"kubernetes.io/projected/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-kube-api-access-s2vpl\") pod \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\" (UID: \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\") " Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.804292 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-catalog-content\") pod \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\" (UID: \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\") " Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.804323 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-utilities\") pod \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\" (UID: \"00066a67-563b-4b32-a1fb-fdf5f4de8f6c\") " Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.805698 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-utilities" (OuterVolumeSpecName: "utilities") pod "00066a67-563b-4b32-a1fb-fdf5f4de8f6c" (UID: "00066a67-563b-4b32-a1fb-fdf5f4de8f6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.831889 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-kube-api-access-s2vpl" (OuterVolumeSpecName: "kube-api-access-s2vpl") pod "00066a67-563b-4b32-a1fb-fdf5f4de8f6c" (UID: "00066a67-563b-4b32-a1fb-fdf5f4de8f6c"). InnerVolumeSpecName "kube-api-access-s2vpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.905968 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw9hg\" (UniqueName: \"kubernetes.io/projected/01b4e1d6-1a7d-4113-8954-278cfe2d60c3-kube-api-access-hw9hg\") pod \"01b4e1d6-1a7d-4113-8954-278cfe2d60c3\" (UID: \"01b4e1d6-1a7d-4113-8954-278cfe2d60c3\") " Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.906574 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.906594 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2vpl\" (UniqueName: \"kubernetes.io/projected/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-kube-api-access-s2vpl\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.909379 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b4e1d6-1a7d-4113-8954-278cfe2d60c3-kube-api-access-hw9hg" (OuterVolumeSpecName: "kube-api-access-hw9hg") pod "01b4e1d6-1a7d-4113-8954-278cfe2d60c3" (UID: "01b4e1d6-1a7d-4113-8954-278cfe2d60c3"). InnerVolumeSpecName "kube-api-access-hw9hg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:19:17 crc kubenswrapper[4778]: I1205 16:19:17.912988 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00066a67-563b-4b32-a1fb-fdf5f4de8f6c" (UID: "00066a67-563b-4b32-a1fb-fdf5f4de8f6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.007533 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00066a67-563b-4b32-a1fb-fdf5f4de8f6c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.007572 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw9hg\" (UniqueName: \"kubernetes.io/projected/01b4e1d6-1a7d-4113-8954-278cfe2d60c3-kube-api-access-hw9hg\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.464809 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.465163 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="ceilometer-notification-agent" containerID="cri-o://6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2" gracePeriod=30 Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.465169 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="sg-core" containerID="cri-o://555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c" gracePeriod=30 Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.465288 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="proxy-httpd" containerID="cri-o://e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad" gracePeriod=30 Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.465134 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="ceilometer-central-agent" containerID="cri-o://c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39" gracePeriod=30 Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.621012 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xmv2" event={"ID":"00066a67-563b-4b32-a1fb-fdf5f4de8f6c","Type":"ContainerDied","Data":"33a26cf3fec156bb162297e4cc4f59240143dc3c814dc9bd3fb1343406394553"} Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.621073 4778 scope.go:117] "RemoveContainer" containerID="dc17d22259eb2080b5f7e23de2ae00cd8104e849840c32af318ff4b3c74c6b6d" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.621186 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5xmv2" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.629065 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"01b4e1d6-1a7d-4113-8954-278cfe2d60c3","Type":"ContainerDied","Data":"fe7d71a7226f273fb87fe7bdfd3890b738940c36456ded2b9115c90710544894"} Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.629164 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.637991 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstackclient" event={"ID":"ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8","Type":"ContainerStarted","Data":"92fa330224bde729d08276c56b5555ae567130bba048218767f87d731f5d6be3"} Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.644870 4778 scope.go:117] "RemoveContainer" containerID="eeb240335d0b51ce5a0dbbb214955417f246da762581e95408fd43f8e2073381" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.646790 4778 generic.go:334] "Generic (PLEG): container finished" podID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerID="e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad" exitCode=0 Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.646884 4778 generic.go:334] "Generic (PLEG): container finished" podID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerID="555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c" exitCode=2 Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.646913 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7bbb3480-c839-40fc-8c46-02d607401e7b","Type":"ContainerDied","Data":"e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad"} Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.646965 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7bbb3480-c839-40fc-8c46-02d607401e7b","Type":"ContainerDied","Data":"555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c"} Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.669010 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/openstackclient" podStartSLOduration=2.949557865 podStartE2EDuration="12.66899452s" podCreationTimestamp="2025-12-05 16:19:06 +0000 UTC" firstStartedPulling="2025-12-05 16:19:07.678477144 +0000 UTC m=+1434.782273514" lastFinishedPulling="2025-12-05 16:19:17.397913789 +0000 UTC m=+1444.501710169" observedRunningTime="2025-12-05 16:19:18.662291311 +0000 UTC m=+1445.766087711" watchObservedRunningTime="2025-12-05 16:19:18.66899452 +0000 UTC m=+1445.772790900" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.687043 4778 scope.go:117] "RemoveContainer" containerID="6debf307eb26529bb19a42baddfbaded594b95ee512e2a5a99a7458a4e8fe1fc" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.695687 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5xmv2"] Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.721013 4778 scope.go:117] "RemoveContainer" containerID="9f9cc9f620d6ce4a31aae4d3bb7a095d64b1b1333ce9f316acde898ebd684071" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.722480 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5xmv2"] Dec 05 16:19:18 crc 
kubenswrapper[4778]: I1205 16:19:18.733112 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.743427 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.753554 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 16:19:18 crc kubenswrapper[4778]: E1205 16:19:18.753990 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" containerName="extract-utilities" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.754007 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" containerName="extract-utilities" Dec 05 16:19:18 crc kubenswrapper[4778]: E1205 16:19:18.754021 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b4e1d6-1a7d-4113-8954-278cfe2d60c3" containerName="kube-state-metrics" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.754028 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b4e1d6-1a7d-4113-8954-278cfe2d60c3" containerName="kube-state-metrics" Dec 05 16:19:18 crc kubenswrapper[4778]: E1205 16:19:18.754039 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" containerName="registry-server" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.754045 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" containerName="registry-server" Dec 05 16:19:18 crc kubenswrapper[4778]: E1205 16:19:18.754058 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" containerName="extract-content" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.754065 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" containerName="extract-content" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.754227 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b4e1d6-1a7d-4113-8954-278cfe2d60c3" containerName="kube-state-metrics" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.754241 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" containerName="registry-server" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.754820 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.758793 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-kube-state-metrics-svc" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.760651 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"kube-state-metrics-tls-config" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.761705 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.821326 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4fz8\" (UniqueName: \"kubernetes.io/projected/04eaf4e3-1e04-4041-9721-d2b1bfcb44c9-kube-api-access-l4fz8\") pod \"kube-state-metrics-0\" (UID: \"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.821396 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/04eaf4e3-1e04-4041-9721-d2b1bfcb44c9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.821420 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04eaf4e3-1e04-4041-9721-d2b1bfcb44c9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.821438 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/04eaf4e3-1e04-4041-9721-d2b1bfcb44c9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.923166 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4fz8\" (UniqueName: \"kubernetes.io/projected/04eaf4e3-1e04-4041-9721-d2b1bfcb44c9-kube-api-access-l4fz8\") pod \"kube-state-metrics-0\" (UID: \"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.923607 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/04eaf4e3-1e04-4041-9721-d2b1bfcb44c9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.923662 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04eaf4e3-1e04-4041-9721-d2b1bfcb44c9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 
16:19:18.923721 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/04eaf4e3-1e04-4041-9721-d2b1bfcb44c9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.939767 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/04eaf4e3-1e04-4041-9721-d2b1bfcb44c9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.939829 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/04eaf4e3-1e04-4041-9721-d2b1bfcb44c9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.944847 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4fz8\" (UniqueName: \"kubernetes.io/projected/04eaf4e3-1e04-4041-9721-d2b1bfcb44c9-kube-api-access-l4fz8\") pod \"kube-state-metrics-0\" (UID: \"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:18 crc kubenswrapper[4778]: I1205 16:19:18.955489 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04eaf4e3-1e04-4041-9721-d2b1bfcb44c9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.099771 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.167923 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.228598 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blc57\" (UniqueName: \"kubernetes.io/projected/7bbb3480-c839-40fc-8c46-02d607401e7b-kube-api-access-blc57\") pod \"7bbb3480-c839-40fc-8c46-02d607401e7b\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.228655 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbb3480-c839-40fc-8c46-02d607401e7b-log-httpd\") pod \"7bbb3480-c839-40fc-8c46-02d607401e7b\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.228730 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbb3480-c839-40fc-8c46-02d607401e7b-run-httpd\") pod \"7bbb3480-c839-40fc-8c46-02d607401e7b\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.228781 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-config-data\") pod \"7bbb3480-c839-40fc-8c46-02d607401e7b\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.228838 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-combined-ca-bundle\") pod \"7bbb3480-c839-40fc-8c46-02d607401e7b\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.228883 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-scripts\") pod \"7bbb3480-c839-40fc-8c46-02d607401e7b\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.228908 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-sg-core-conf-yaml\") pod \"7bbb3480-c839-40fc-8c46-02d607401e7b\" (UID: \"7bbb3480-c839-40fc-8c46-02d607401e7b\") " Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.229432 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbb3480-c839-40fc-8c46-02d607401e7b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7bbb3480-c839-40fc-8c46-02d607401e7b" (UID: "7bbb3480-c839-40fc-8c46-02d607401e7b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.229769 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbb3480-c839-40fc-8c46-02d607401e7b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7bbb3480-c839-40fc-8c46-02d607401e7b" (UID: "7bbb3480-c839-40fc-8c46-02d607401e7b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.236122 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbb3480-c839-40fc-8c46-02d607401e7b-kube-api-access-blc57" (OuterVolumeSpecName: "kube-api-access-blc57") pod "7bbb3480-c839-40fc-8c46-02d607401e7b" (UID: "7bbb3480-c839-40fc-8c46-02d607401e7b"). InnerVolumeSpecName "kube-api-access-blc57". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.236487 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-scripts" (OuterVolumeSpecName: "scripts") pod "7bbb3480-c839-40fc-8c46-02d607401e7b" (UID: "7bbb3480-c839-40fc-8c46-02d607401e7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.256868 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7bbb3480-c839-40fc-8c46-02d607401e7b" (UID: "7bbb3480-c839-40fc-8c46-02d607401e7b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.283445 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00066a67-563b-4b32-a1fb-fdf5f4de8f6c" path="/var/lib/kubelet/pods/00066a67-563b-4b32-a1fb-fdf5f4de8f6c/volumes" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.285264 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01b4e1d6-1a7d-4113-8954-278cfe2d60c3" path="/var/lib/kubelet/pods/01b4e1d6-1a7d-4113-8954-278cfe2d60c3/volumes" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.339850 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.339885 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.339899 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blc57\" (UniqueName: \"kubernetes.io/projected/7bbb3480-c839-40fc-8c46-02d607401e7b-kube-api-access-blc57\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.339912 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbb3480-c839-40fc-8c46-02d607401e7b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.339922 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbb3480-c839-40fc-8c46-02d607401e7b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.359983 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bbb3480-c839-40fc-8c46-02d607401e7b" (UID: "7bbb3480-c839-40fc-8c46-02d607401e7b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.362353 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-config-data" (OuterVolumeSpecName: "config-data") pod "7bbb3480-c839-40fc-8c46-02d607401e7b" (UID: "7bbb3480-c839-40fc-8c46-02d607401e7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.441760 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.441795 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbb3480-c839-40fc-8c46-02d607401e7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.624174 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 16:19:19 crc kubenswrapper[4778]: W1205 16:19:19.631327 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04eaf4e3_1e04_4041_9721_d2b1bfcb44c9.slice/crio-6ec060c629fe630e0ed68128629537c989f1a633dab447a67704b840ff2c2712 WatchSource:0}: Error finding container 6ec060c629fe630e0ed68128629537c989f1a633dab447a67704b840ff2c2712: Status 404 returned error can't find the container with id 6ec060c629fe630e0ed68128629537c989f1a633dab447a67704b840ff2c2712 Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.654662 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9","Type":"ContainerStarted","Data":"6ec060c629fe630e0ed68128629537c989f1a633dab447a67704b840ff2c2712"} Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.662529 4778 generic.go:334] "Generic (PLEG): container finished" podID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerID="6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2" exitCode=0 Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.662558 4778 generic.go:334] "Generic (PLEG): container finished" podID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerID="c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39" exitCode=0 Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.662650 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7bbb3480-c839-40fc-8c46-02d607401e7b","Type":"ContainerDied","Data":"6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2"} Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.662694 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7bbb3480-c839-40fc-8c46-02d607401e7b","Type":"ContainerDied","Data":"c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39"} Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.662705 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7bbb3480-c839-40fc-8c46-02d607401e7b","Type":"ContainerDied","Data":"cd21dfeaee2fe0ca30e61569126e2c07af7b4f97a0431b927ac6c8d7bbf3a1fe"} Dec 05 16:19:19 crc 
kubenswrapper[4778]: I1205 16:19:19.662720 4778 scope.go:117] "RemoveContainer" containerID="e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.662783 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.695529 4778 scope.go:117] "RemoveContainer" containerID="555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.708294 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.714679 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.721002 4778 scope.go:117] "RemoveContainer" containerID="6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.749281 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:19:19 crc kubenswrapper[4778]: E1205 16:19:19.750799 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="sg-core" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.750915 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="sg-core" Dec 05 16:19:19 crc kubenswrapper[4778]: E1205 16:19:19.751002 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="proxy-httpd" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.751082 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="proxy-httpd" Dec 05 16:19:19 crc kubenswrapper[4778]: E1205 16:19:19.751148 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="ceilometer-central-agent" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.752166 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="ceilometer-central-agent" Dec 05 16:19:19 crc kubenswrapper[4778]: E1205 16:19:19.752260 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="ceilometer-notification-agent" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.752332 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="ceilometer-notification-agent" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.752763 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="ceilometer-central-agent" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.752851 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="proxy-httpd" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.752924 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="ceilometer-notification-agent" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.753006 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" containerName="sg-core" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.763588 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.772574 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.772889 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.772954 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.777819 4778 scope.go:117] "RemoveContainer" containerID="c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.786975 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.805644 4778 scope.go:117] "RemoveContainer" containerID="e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad" Dec 05 16:19:19 crc kubenswrapper[4778]: E1205 16:19:19.806083 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad\": container with ID starting with e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad not found: ID does not exist" containerID="e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.806120 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad"} err="failed to get container status \"e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad\": rpc error: code = NotFound desc = could not find container \"e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad\": container with ID starting with e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad not found: ID does not exist" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.806146 4778 scope.go:117] "RemoveContainer" containerID="555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c" Dec 05 16:19:19 crc kubenswrapper[4778]: E1205 16:19:19.806354 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c\": container with ID starting with 555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c not found: ID does not exist" containerID="555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.806409 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c"} err="failed to get container status \"555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c\": rpc error: code = NotFound desc = could not find container \"555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c\": container with ID starting with 
555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c not found: ID does not exist" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.806426 4778 scope.go:117] "RemoveContainer" containerID="6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2" Dec 05 16:19:19 crc kubenswrapper[4778]: E1205 16:19:19.809848 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2\": container with ID starting with 6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2 not found: ID does not exist" containerID="6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.809963 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2"} err="failed to get container status \"6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2\": rpc error: code = NotFound desc = could not find container \"6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2\": container with ID starting with 6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2 not found: ID does not exist" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.810042 4778 scope.go:117] "RemoveContainer" containerID="c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39" Dec 05 16:19:19 crc kubenswrapper[4778]: E1205 16:19:19.810695 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39\": container with ID starting with c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39 not found: ID does not exist" containerID="c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.810818 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39"} err="failed to get container status \"c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39\": rpc error: code = NotFound desc = could not find container \"c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39\": container with ID starting with c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39 not found: ID does not exist" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.810916 4778 scope.go:117] "RemoveContainer" containerID="e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.814715 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad"} err="failed to get container status \"e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad\": rpc error: code = NotFound desc = could not find container \"e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad\": container with ID starting with e1423c5edddd6024e45abc4e88279f1da749a427ac314296efe2a611ce68d9ad not found: ID does not exist" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.814762 4778 scope.go:117] "RemoveContainer" containerID="555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c" Dec 05 16:19:19 crc 
kubenswrapper[4778]: I1205 16:19:19.815136 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c"} err="failed to get container status \"555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c\": rpc error: code = NotFound desc = could not find container \"555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c\": container with ID starting with 555dbba4b94a2f374c3efe3ada4dced6113b92be93c6d46fca6f0f3752f62f8c not found: ID does not exist" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.815162 4778 scope.go:117] "RemoveContainer" containerID="6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.815421 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2"} err="failed to get container status \"6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2\": rpc error: code = NotFound desc = could not find container \"6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2\": container with ID starting with 6ea560a9f831d1be4426d888c91479d63850329aa0cf50c9df2ffd517a0078b2 not found: ID does not exist" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.815447 4778 scope.go:117] "RemoveContainer" containerID="c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.815688 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39"} err="failed to get container status \"c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39\": rpc error: code = NotFound desc = could not find container \"c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39\": container with ID starting with c3cb828a08f762171789acd86a6051b30f463344e915feda3f764c8a0681de39 not found: ID does not exist" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.848905 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.848963 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-scripts\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.849154 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-config-data\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.849260 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.849281 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gthv\" (UniqueName: \"kubernetes.io/projected/e8c2e058-ef3f-40d8-b946-6b1539e3491e-kube-api-access-5gthv\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.849400 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.849455 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2e058-ef3f-40d8-b946-6b1539e3491e-run-httpd\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.849522 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2e058-ef3f-40d8-b946-6b1539e3491e-log-httpd\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.951008 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.951065 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gthv\" (UniqueName: \"kubernetes.io/projected/e8c2e058-ef3f-40d8-b946-6b1539e3491e-kube-api-access-5gthv\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.951120 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.951155 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2e058-ef3f-40d8-b946-6b1539e3491e-run-httpd\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.951195 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2e058-ef3f-40d8-b946-6b1539e3491e-log-httpd\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: 
I1205 16:19:19.951264 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.951293 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-scripts\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.951318 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-config-data\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.951742 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2e058-ef3f-40d8-b946-6b1539e3491e-log-httpd\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.952095 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2e058-ef3f-40d8-b946-6b1539e3491e-run-httpd\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.963732 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.963754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-scripts\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.963766 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.964459 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-config-data\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.967933 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 
16:19:19 crc kubenswrapper[4778]: I1205 16:19:19.974116 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gthv\" (UniqueName: \"kubernetes.io/projected/e8c2e058-ef3f-40d8-b946-6b1539e3491e-kube-api-access-5gthv\") pod \"ceilometer-0\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:20 crc kubenswrapper[4778]: I1205 16:19:20.096677 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:20 crc kubenswrapper[4778]: I1205 16:19:20.679455 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"04eaf4e3-1e04-4041-9721-d2b1bfcb44c9","Type":"ContainerStarted","Data":"6068b5d0b9098d302cfe190609ccfe06bbfbd611b3a260bc15aca853475a0f10"} Dec 05 16:19:20 crc kubenswrapper[4778]: I1205 16:19:20.679893 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:20 crc kubenswrapper[4778]: I1205 16:19:20.705252 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/kube-state-metrics-0" podStartSLOduration=2.329494205 podStartE2EDuration="2.705232725s" podCreationTimestamp="2025-12-05 16:19:18 +0000 UTC" firstStartedPulling="2025-12-05 16:19:19.635487713 +0000 UTC m=+1446.739284103" lastFinishedPulling="2025-12-05 16:19:20.011226243 +0000 UTC m=+1447.115022623" observedRunningTime="2025-12-05 16:19:20.697224402 +0000 UTC m=+1447.801020792" watchObservedRunningTime="2025-12-05 16:19:20.705232725 +0000 UTC m=+1447.809029105" Dec 05 16:19:21 crc kubenswrapper[4778]: I1205 16:19:21.265713 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbb3480-c839-40fc-8c46-02d607401e7b" path="/var/lib/kubelet/pods/7bbb3480-c839-40fc-8c46-02d607401e7b/volumes" Dec 05 16:19:21 crc kubenswrapper[4778]: I1205 16:19:21.299118 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:19:21 crc kubenswrapper[4778]: I1205 16:19:21.691269 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8c2e058-ef3f-40d8-b946-6b1539e3491e","Type":"ContainerStarted","Data":"a9fbae26e68ace4be17c5665a6540c018ae6143cdacff48677ed8f5596aa77ba"} Dec 05 16:19:22 crc kubenswrapper[4778]: I1205 16:19:22.670840 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="01b4e1d6-1a7d-4113-8954-278cfe2d60c3" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 16:19:22 crc kubenswrapper[4778]: I1205 16:19:22.700289 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8c2e058-ef3f-40d8-b946-6b1539e3491e","Type":"ContainerStarted","Data":"b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753"} Dec 05 16:19:22 crc kubenswrapper[4778]: I1205 16:19:22.700338 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8c2e058-ef3f-40d8-b946-6b1539e3491e","Type":"ContainerStarted","Data":"e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e"} Dec 05 16:19:22 crc kubenswrapper[4778]: I1205 16:19:22.933051 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["watcher-kuttl-default/watcher-2006-account-create-update-f2bbc"] Dec 05 16:19:22 crc kubenswrapper[4778]: I1205 16:19:22.934357 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" Dec 05 16:19:22 crc kubenswrapper[4778]: I1205 16:19:22.938229 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 05 16:19:22 crc kubenswrapper[4778]: I1205 16:19:22.943405 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-kxr4q"] Dec 05 16:19:22 crc kubenswrapper[4778]: I1205 16:19:22.944586 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-kxr4q" Dec 05 16:19:22 crc kubenswrapper[4778]: I1205 16:19:22.952714 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-2006-account-create-update-f2bbc"] Dec 05 16:19:22 crc kubenswrapper[4778]: I1205 16:19:22.960374 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-kxr4q"] Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.019135 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b79508-5cee-451c-8622-95bbc1a98a14-operator-scripts\") pod \"watcher-2006-account-create-update-f2bbc\" (UID: \"a5b79508-5cee-451c-8622-95bbc1a98a14\") " pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.019508 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrd6b\" (UniqueName: \"kubernetes.io/projected/a5b79508-5cee-451c-8622-95bbc1a98a14-kube-api-access-jrd6b\") pod \"watcher-2006-account-create-update-f2bbc\" (UID: \"a5b79508-5cee-451c-8622-95bbc1a98a14\") " pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.121337 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrd6b\" (UniqueName: \"kubernetes.io/projected/a5b79508-5cee-451c-8622-95bbc1a98a14-kube-api-access-jrd6b\") pod \"watcher-2006-account-create-update-f2bbc\" (UID: \"a5b79508-5cee-451c-8622-95bbc1a98a14\") " pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.121429 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3337ad4-afc6-43ab-bcad-98583fbed9bf-operator-scripts\") pod \"watcher-db-create-kxr4q\" (UID: \"a3337ad4-afc6-43ab-bcad-98583fbed9bf\") " pod="watcher-kuttl-default/watcher-db-create-kxr4q" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.121465 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc692\" (UniqueName: \"kubernetes.io/projected/a3337ad4-afc6-43ab-bcad-98583fbed9bf-kube-api-access-wc692\") pod \"watcher-db-create-kxr4q\" (UID: \"a3337ad4-afc6-43ab-bcad-98583fbed9bf\") " pod="watcher-kuttl-default/watcher-db-create-kxr4q" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.121500 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a5b79508-5cee-451c-8622-95bbc1a98a14-operator-scripts\") pod \"watcher-2006-account-create-update-f2bbc\" (UID: \"a5b79508-5cee-451c-8622-95bbc1a98a14\") " pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.122074 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b79508-5cee-451c-8622-95bbc1a98a14-operator-scripts\") pod \"watcher-2006-account-create-update-f2bbc\" (UID: \"a5b79508-5cee-451c-8622-95bbc1a98a14\") " pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.144776 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrd6b\" (UniqueName: \"kubernetes.io/projected/a5b79508-5cee-451c-8622-95bbc1a98a14-kube-api-access-jrd6b\") pod \"watcher-2006-account-create-update-f2bbc\" (UID: \"a5b79508-5cee-451c-8622-95bbc1a98a14\") " pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.223608 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3337ad4-afc6-43ab-bcad-98583fbed9bf-operator-scripts\") pod \"watcher-db-create-kxr4q\" (UID: \"a3337ad4-afc6-43ab-bcad-98583fbed9bf\") " pod="watcher-kuttl-default/watcher-db-create-kxr4q" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.223686 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc692\" (UniqueName: \"kubernetes.io/projected/a3337ad4-afc6-43ab-bcad-98583fbed9bf-kube-api-access-wc692\") pod \"watcher-db-create-kxr4q\" (UID: \"a3337ad4-afc6-43ab-bcad-98583fbed9bf\") " pod="watcher-kuttl-default/watcher-db-create-kxr4q" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.224591 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3337ad4-afc6-43ab-bcad-98583fbed9bf-operator-scripts\") pod \"watcher-db-create-kxr4q\" (UID: \"a3337ad4-afc6-43ab-bcad-98583fbed9bf\") " pod="watcher-kuttl-default/watcher-db-create-kxr4q" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.239151 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc692\" (UniqueName: \"kubernetes.io/projected/a3337ad4-afc6-43ab-bcad-98583fbed9bf-kube-api-access-wc692\") pod \"watcher-db-create-kxr4q\" (UID: \"a3337ad4-afc6-43ab-bcad-98583fbed9bf\") " pod="watcher-kuttl-default/watcher-db-create-kxr4q" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.255894 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.268605 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-kxr4q" Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.619511 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-2006-account-create-update-f2bbc"] Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.714644 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-kxr4q"] Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.727715 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8c2e058-ef3f-40d8-b946-6b1539e3491e","Type":"ContainerStarted","Data":"c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91"} Dec 05 16:19:23 crc kubenswrapper[4778]: I1205 16:19:23.730739 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" event={"ID":"a5b79508-5cee-451c-8622-95bbc1a98a14","Type":"ContainerStarted","Data":"53410dc5bff59a60a4b907dd6e7ac7a4d831bdd4f9ad5e61a58e2becedd81e05"} Dec 05 16:19:23 crc kubenswrapper[4778]: W1205 16:19:23.775638 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3337ad4_afc6_43ab_bcad_98583fbed9bf.slice/crio-c64bda7814ee82a443a837ddc02e2e39a8d02dd5b171fa8ab03749b89d16c60b WatchSource:0}: Error finding container c64bda7814ee82a443a837ddc02e2e39a8d02dd5b171fa8ab03749b89d16c60b: Status 404 returned error can't find the container with id c64bda7814ee82a443a837ddc02e2e39a8d02dd5b171fa8ab03749b89d16c60b Dec 05 16:19:24 crc kubenswrapper[4778]: I1205 16:19:24.738784 4778 generic.go:334] "Generic (PLEG): container finished" podID="a3337ad4-afc6-43ab-bcad-98583fbed9bf" containerID="fcfbe6afb3bc84e41cb91c36beb0a756b1ad1d7ff69825fded05442a2a43afa7" exitCode=0 Dec 05 16:19:24 crc kubenswrapper[4778]: I1205 16:19:24.739073 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-kxr4q" event={"ID":"a3337ad4-afc6-43ab-bcad-98583fbed9bf","Type":"ContainerDied","Data":"fcfbe6afb3bc84e41cb91c36beb0a756b1ad1d7ff69825fded05442a2a43afa7"} Dec 05 16:19:24 crc kubenswrapper[4778]: I1205 16:19:24.739124 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-kxr4q" event={"ID":"a3337ad4-afc6-43ab-bcad-98583fbed9bf","Type":"ContainerStarted","Data":"c64bda7814ee82a443a837ddc02e2e39a8d02dd5b171fa8ab03749b89d16c60b"} Dec 05 16:19:24 crc kubenswrapper[4778]: I1205 16:19:24.741036 4778 generic.go:334] "Generic (PLEG): container finished" podID="a5b79508-5cee-451c-8622-95bbc1a98a14" containerID="dfc4c892a81342049d9ee0b5239096c2e5592c0c193dca1ea53413f885bd4a9b" exitCode=0 Dec 05 16:19:24 crc kubenswrapper[4778]: I1205 16:19:24.741062 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" event={"ID":"a5b79508-5cee-451c-8622-95bbc1a98a14","Type":"ContainerDied","Data":"dfc4c892a81342049d9ee0b5239096c2e5592c0c193dca1ea53413f885bd4a9b"} Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.095090 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-kxr4q" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.105087 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.275743 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrd6b\" (UniqueName: \"kubernetes.io/projected/a5b79508-5cee-451c-8622-95bbc1a98a14-kube-api-access-jrd6b\") pod \"a5b79508-5cee-451c-8622-95bbc1a98a14\" (UID: \"a5b79508-5cee-451c-8622-95bbc1a98a14\") " Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.275845 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3337ad4-afc6-43ab-bcad-98583fbed9bf-operator-scripts\") pod \"a3337ad4-afc6-43ab-bcad-98583fbed9bf\" (UID: \"a3337ad4-afc6-43ab-bcad-98583fbed9bf\") " Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.275905 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc692\" (UniqueName: \"kubernetes.io/projected/a3337ad4-afc6-43ab-bcad-98583fbed9bf-kube-api-access-wc692\") pod \"a3337ad4-afc6-43ab-bcad-98583fbed9bf\" (UID: \"a3337ad4-afc6-43ab-bcad-98583fbed9bf\") " Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.275967 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b79508-5cee-451c-8622-95bbc1a98a14-operator-scripts\") pod \"a5b79508-5cee-451c-8622-95bbc1a98a14\" (UID: \"a5b79508-5cee-451c-8622-95bbc1a98a14\") " Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.276798 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5b79508-5cee-451c-8622-95bbc1a98a14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5b79508-5cee-451c-8622-95bbc1a98a14" (UID: "a5b79508-5cee-451c-8622-95bbc1a98a14"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.277075 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3337ad4-afc6-43ab-bcad-98583fbed9bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3337ad4-afc6-43ab-bcad-98583fbed9bf" (UID: "a3337ad4-afc6-43ab-bcad-98583fbed9bf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.296775 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b79508-5cee-451c-8622-95bbc1a98a14-kube-api-access-jrd6b" (OuterVolumeSpecName: "kube-api-access-jrd6b") pod "a5b79508-5cee-451c-8622-95bbc1a98a14" (UID: "a5b79508-5cee-451c-8622-95bbc1a98a14"). InnerVolumeSpecName "kube-api-access-jrd6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.296901 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3337ad4-afc6-43ab-bcad-98583fbed9bf-kube-api-access-wc692" (OuterVolumeSpecName: "kube-api-access-wc692") pod "a3337ad4-afc6-43ab-bcad-98583fbed9bf" (UID: "a3337ad4-afc6-43ab-bcad-98583fbed9bf"). InnerVolumeSpecName "kube-api-access-wc692". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.377299 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b79508-5cee-451c-8622-95bbc1a98a14-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.377324 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrd6b\" (UniqueName: \"kubernetes.io/projected/a5b79508-5cee-451c-8622-95bbc1a98a14-kube-api-access-jrd6b\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.377335 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3337ad4-afc6-43ab-bcad-98583fbed9bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.377345 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc692\" (UniqueName: \"kubernetes.io/projected/a3337ad4-afc6-43ab-bcad-98583fbed9bf-kube-api-access-wc692\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.757944 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-kxr4q" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.757932 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-kxr4q" event={"ID":"a3337ad4-afc6-43ab-bcad-98583fbed9bf","Type":"ContainerDied","Data":"c64bda7814ee82a443a837ddc02e2e39a8d02dd5b171fa8ab03749b89d16c60b"} Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.757991 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c64bda7814ee82a443a837ddc02e2e39a8d02dd5b171fa8ab03749b89d16c60b" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.759686 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" event={"ID":"a5b79508-5cee-451c-8622-95bbc1a98a14","Type":"ContainerDied","Data":"53410dc5bff59a60a4b907dd6e7ac7a4d831bdd4f9ad5e61a58e2becedd81e05"} Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.759988 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53410dc5bff59a60a4b907dd6e7ac7a4d831bdd4f9ad5e61a58e2becedd81e05" Dec 05 16:19:26 crc kubenswrapper[4778]: I1205 16:19:26.759725 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-2006-account-create-update-f2bbc" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.294164 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-45mpz"] Dec 05 16:19:28 crc kubenswrapper[4778]: E1205 16:19:28.294749 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3337ad4-afc6-43ab-bcad-98583fbed9bf" containerName="mariadb-database-create" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.294761 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3337ad4-afc6-43ab-bcad-98583fbed9bf" containerName="mariadb-database-create" Dec 05 16:19:28 crc kubenswrapper[4778]: E1205 16:19:28.294771 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b79508-5cee-451c-8622-95bbc1a98a14" containerName="mariadb-account-create-update" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.294777 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b79508-5cee-451c-8622-95bbc1a98a14" containerName="mariadb-account-create-update" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.294940 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b79508-5cee-451c-8622-95bbc1a98a14" containerName="mariadb-account-create-update" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.294950 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3337ad4-afc6-43ab-bcad-98583fbed9bf" containerName="mariadb-database-create" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.295465 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.298290 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.304628 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-45mpz"] Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.305470 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-fltjh" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.405329 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-config-data\") pod \"watcher-kuttl-db-sync-45mpz\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.405438 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8vpb\" (UniqueName: \"kubernetes.io/projected/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-kube-api-access-t8vpb\") pod \"watcher-kuttl-db-sync-45mpz\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.405473 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-45mpz\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 
16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.405507 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-db-sync-config-data\") pod \"watcher-kuttl-db-sync-45mpz\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.507059 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-45mpz\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.507124 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-db-sync-config-data\") pod \"watcher-kuttl-db-sync-45mpz\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.507218 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-config-data\") pod \"watcher-kuttl-db-sync-45mpz\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.507279 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8vpb\" (UniqueName: \"kubernetes.io/projected/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-kube-api-access-t8vpb\") pod \"watcher-kuttl-db-sync-45mpz\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.511216 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-45mpz\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.516088 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-config-data\") pod \"watcher-kuttl-db-sync-45mpz\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.516347 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-db-sync-config-data\") pod \"watcher-kuttl-db-sync-45mpz\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.526514 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8vpb\" (UniqueName: \"kubernetes.io/projected/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-kube-api-access-t8vpb\") pod \"watcher-kuttl-db-sync-45mpz\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") 
" pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:28 crc kubenswrapper[4778]: I1205 16:19:28.668202 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:19:29 crc kubenswrapper[4778]: I1205 16:19:29.126949 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 16:19:29 crc kubenswrapper[4778]: I1205 16:19:29.133335 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-45mpz"] Dec 05 16:19:29 crc kubenswrapper[4778]: I1205 16:19:29.786238 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" event={"ID":"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95","Type":"ContainerStarted","Data":"6f80649e7664e2c9ece948db6376914bbc24626d6412fe60dd172bcb6df8face"} Dec 05 16:19:30 crc kubenswrapper[4778]: I1205 16:19:30.795987 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8c2e058-ef3f-40d8-b946-6b1539e3491e","Type":"ContainerStarted","Data":"39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599"} Dec 05 16:19:30 crc kubenswrapper[4778]: I1205 16:19:30.798529 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:30 crc kubenswrapper[4778]: I1205 16:19:30.821810 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.891204949 podStartE2EDuration="11.821791299s" podCreationTimestamp="2025-12-05 16:19:19 +0000 UTC" firstStartedPulling="2025-12-05 16:19:21.308445399 +0000 UTC m=+1448.412241819" lastFinishedPulling="2025-12-05 16:19:30.239031789 +0000 UTC m=+1457.342828169" observedRunningTime="2025-12-05 16:19:30.816099378 +0000 UTC m=+1457.919895758" watchObservedRunningTime="2025-12-05 16:19:30.821791299 +0000 UTC m=+1457.925587679" Dec 05 16:19:33 crc kubenswrapper[4778]: I1205 16:19:33.415123 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:19:33 crc kubenswrapper[4778]: I1205 16:19:33.415454 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:19:46 crc kubenswrapper[4778]: E1205 16:19:46.842928 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Dec 05 16:19:46 crc kubenswrapper[4778]: E1205 16:19:46.843310 4778 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Dec 05 16:19:46 crc kubenswrapper[4778]: E1205 16:19:46.843443 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:watcher-kuttl-db-sync,Image:38.102.83.36:5001/podified-master-centos10/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t8vpb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-kuttl-db-sync-45mpz_watcher-kuttl-default(c3bd8a9e-4fe1-4a55-9344-4c7786c53d95): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:19:46 crc kubenswrapper[4778]: E1205 16:19:46.844626 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-kuttl-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" podUID="c3bd8a9e-4fe1-4a55-9344-4c7786c53d95" Dec 05 16:19:46 crc kubenswrapper[4778]: E1205 16:19:46.952037 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-kuttl-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/podified-master-centos10/openstack-watcher-api:watcher_latest\\\"\"" pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" podUID="c3bd8a9e-4fe1-4a55-9344-4c7786c53d95" Dec 05 16:19:50 crc kubenswrapper[4778]: I1205 16:19:50.106884 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:19:59 crc kubenswrapper[4778]: I1205 16:19:59.054996 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" 
event={"ID":"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95","Type":"ContainerStarted","Data":"f0823131a3b33ee75eea465cae545dbb67d8936f66dd660799b6e6b558458d2c"} Dec 05 16:19:59 crc kubenswrapper[4778]: I1205 16:19:59.077529 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" podStartSLOduration=1.884647707 podStartE2EDuration="31.077508875s" podCreationTimestamp="2025-12-05 16:19:28 +0000 UTC" firstStartedPulling="2025-12-05 16:19:29.140091391 +0000 UTC m=+1456.243887771" lastFinishedPulling="2025-12-05 16:19:58.332952549 +0000 UTC m=+1485.436748939" observedRunningTime="2025-12-05 16:19:59.077139936 +0000 UTC m=+1486.180936316" watchObservedRunningTime="2025-12-05 16:19:59.077508875 +0000 UTC m=+1486.181305255" Dec 05 16:20:03 crc kubenswrapper[4778]: I1205 16:20:03.101359 4778 generic.go:334] "Generic (PLEG): container finished" podID="c3bd8a9e-4fe1-4a55-9344-4c7786c53d95" containerID="f0823131a3b33ee75eea465cae545dbb67d8936f66dd660799b6e6b558458d2c" exitCode=0 Dec 05 16:20:03 crc kubenswrapper[4778]: I1205 16:20:03.101494 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" event={"ID":"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95","Type":"ContainerDied","Data":"f0823131a3b33ee75eea465cae545dbb67d8936f66dd660799b6e6b558458d2c"} Dec 05 16:20:03 crc kubenswrapper[4778]: I1205 16:20:03.415049 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:20:03 crc kubenswrapper[4778]: I1205 16:20:03.415148 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.494687 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.648461 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8vpb\" (UniqueName: \"kubernetes.io/projected/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-kube-api-access-t8vpb\") pod \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.648522 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-db-sync-config-data\") pod \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.648610 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-combined-ca-bundle\") pod \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.648748 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-config-data\") pod \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\" (UID: \"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95\") " Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.653853 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-kube-api-access-t8vpb" (OuterVolumeSpecName: "kube-api-access-t8vpb") pod "c3bd8a9e-4fe1-4a55-9344-4c7786c53d95" (UID: "c3bd8a9e-4fe1-4a55-9344-4c7786c53d95"). InnerVolumeSpecName "kube-api-access-t8vpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.654042 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c3bd8a9e-4fe1-4a55-9344-4c7786c53d95" (UID: "c3bd8a9e-4fe1-4a55-9344-4c7786c53d95"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.680272 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3bd8a9e-4fe1-4a55-9344-4c7786c53d95" (UID: "c3bd8a9e-4fe1-4a55-9344-4c7786c53d95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.693007 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-config-data" (OuterVolumeSpecName: "config-data") pod "c3bd8a9e-4fe1-4a55-9344-4c7786c53d95" (UID: "c3bd8a9e-4fe1-4a55-9344-4c7786c53d95"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.750767 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8vpb\" (UniqueName: \"kubernetes.io/projected/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-kube-api-access-t8vpb\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.751150 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.751159 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:04 crc kubenswrapper[4778]: I1205 16:20:04.751168 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.120108 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" event={"ID":"c3bd8a9e-4fe1-4a55-9344-4c7786c53d95","Type":"ContainerDied","Data":"6f80649e7664e2c9ece948db6376914bbc24626d6412fe60dd172bcb6df8face"} Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.120143 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f80649e7664e2c9ece948db6376914bbc24626d6412fe60dd172bcb6df8face" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.120221 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-45mpz" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.481590 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:20:05 crc kubenswrapper[4778]: E1205 16:20:05.481951 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bd8a9e-4fe1-4a55-9344-4c7786c53d95" containerName="watcher-kuttl-db-sync" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.481969 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bd8a9e-4fe1-4a55-9344-4c7786c53d95" containerName="watcher-kuttl-db-sync" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.482148 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bd8a9e-4fe1-4a55-9344-4c7786c53d95" containerName="watcher-kuttl-db-sync" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.482969 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.498196 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.498232 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-fltjh" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.511737 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.541112 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.542442 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.550382 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.552470 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.565715 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a23b2048-eacf-48b8-b198-e762af53e3ec-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.565755 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.565781 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.565846 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kplqt\" (UniqueName: \"kubernetes.io/projected/6ac274f9-e27a-4b27-8635-85b34fb26a5a-kube-api-access-kplqt\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.565880 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7bv2\" (UniqueName: \"kubernetes.io/projected/a23b2048-eacf-48b8-b198-e762af53e3ec-kube-api-access-x7bv2\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.565915 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.565936 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.565952 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.565978 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac274f9-e27a-4b27-8635-85b34fb26a5a-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.565998 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.568990 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.573636 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.580194 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.597642 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667295 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ddb739-bacb-47f2-9c4f-9a70824df568-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667350 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq72f\" (UniqueName: \"kubernetes.io/projected/f9ddb739-bacb-47f2-9c4f-9a70824df568-kube-api-access-kq72f\") pod \"watcher-kuttl-applier-0\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kplqt\" (UniqueName: \"kubernetes.io/projected/6ac274f9-e27a-4b27-8635-85b34fb26a5a-kube-api-access-kplqt\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667433 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7bv2\" (UniqueName: \"kubernetes.io/projected/a23b2048-eacf-48b8-b198-e762af53e3ec-kube-api-access-x7bv2\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667456 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddb739-bacb-47f2-9c4f-9a70824df568-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667480 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddb739-bacb-47f2-9c4f-9a70824df568-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667498 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667521 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-config-data\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667540 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667563 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac274f9-e27a-4b27-8635-85b34fb26a5a-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667620 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a23b2048-eacf-48b8-b198-e762af53e3ec-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667637 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.667660 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.668103 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac274f9-e27a-4b27-8635-85b34fb26a5a-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.668136 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a23b2048-eacf-48b8-b198-e762af53e3ec-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.672205 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.675975 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.675985 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.683804 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.683960 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kplqt\" (UniqueName: \"kubernetes.io/projected/6ac274f9-e27a-4b27-8635-85b34fb26a5a-kube-api-access-kplqt\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.683981 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.687023 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.703255 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7bv2\" (UniqueName: \"kubernetes.io/projected/a23b2048-eacf-48b8-b198-e762af53e3ec-kube-api-access-x7bv2\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.770148 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddb739-bacb-47f2-9c4f-9a70824df568-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.770207 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddb739-bacb-47f2-9c4f-9a70824df568-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.770325 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ddb739-bacb-47f2-9c4f-9a70824df568-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.770356 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq72f\" (UniqueName: \"kubernetes.io/projected/f9ddb739-bacb-47f2-9c4f-9a70824df568-kube-api-access-kq72f\") pod \"watcher-kuttl-applier-0\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.770858 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ddb739-bacb-47f2-9c4f-9a70824df568-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.773838 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddb739-bacb-47f2-9c4f-9a70824df568-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.776242 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddb739-bacb-47f2-9c4f-9a70824df568-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.791770 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq72f\" (UniqueName: \"kubernetes.io/projected/f9ddb739-bacb-47f2-9c4f-9a70824df568-kube-api-access-kq72f\") pod \"watcher-kuttl-applier-0\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.801640 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.862152 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:05 crc kubenswrapper[4778]: I1205 16:20:05.891145 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:06 crc kubenswrapper[4778]: I1205 16:20:06.469831 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:20:06 crc kubenswrapper[4778]: I1205 16:20:06.518022 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:20:06 crc kubenswrapper[4778]: W1205 16:20:06.519872 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda23b2048_eacf_48b8_b198_e762af53e3ec.slice/crio-1738ed0d523fa8bef2775c16917bef6da6339ce47da601c511f064fad1db9b7a WatchSource:0}: Error finding container 1738ed0d523fa8bef2775c16917bef6da6339ce47da601c511f064fad1db9b7a: Status 404 returned error can't find the container with id 1738ed0d523fa8bef2775c16917bef6da6339ce47da601c511f064fad1db9b7a Dec 05 16:20:06 crc kubenswrapper[4778]: W1205 16:20:06.527641 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ac274f9_e27a_4b27_8635_85b34fb26a5a.slice/crio-7ebf4c75b9a3fc5fbf012f86e7e0905418c99233c557887a216ce20f74c3ae6d WatchSource:0}: Error finding container 7ebf4c75b9a3fc5fbf012f86e7e0905418c99233c557887a216ce20f74c3ae6d: Status 404 returned error can't find the container with id 7ebf4c75b9a3fc5fbf012f86e7e0905418c99233c557887a216ce20f74c3ae6d Dec 05 16:20:06 crc kubenswrapper[4778]: I1205 16:20:06.528706 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:20:07 crc kubenswrapper[4778]: I1205 16:20:07.148187 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"a23b2048-eacf-48b8-b198-e762af53e3ec","Type":"ContainerStarted","Data":"1738ed0d523fa8bef2775c16917bef6da6339ce47da601c511f064fad1db9b7a"} Dec 05 16:20:07 crc kubenswrapper[4778]: I1205 16:20:07.149990 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6ac274f9-e27a-4b27-8635-85b34fb26a5a","Type":"ContainerStarted","Data":"7ebf4c75b9a3fc5fbf012f86e7e0905418c99233c557887a216ce20f74c3ae6d"} Dec 05 16:20:07 crc kubenswrapper[4778]: I1205 16:20:07.151734 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"f9ddb739-bacb-47f2-9c4f-9a70824df568","Type":"ContainerStarted","Data":"c6b844df0630f22732ffcb266e0c5f30b126dff63430b686e946a1b72d1ee9b0"} Dec 05 16:20:08 crc kubenswrapper[4778]: I1205 16:20:08.172897 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6ac274f9-e27a-4b27-8635-85b34fb26a5a","Type":"ContainerStarted","Data":"9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8"} Dec 05 16:20:09 crc kubenswrapper[4778]: I1205 16:20:09.195889 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6ac274f9-e27a-4b27-8635-85b34fb26a5a","Type":"ContainerStarted","Data":"6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d"} Dec 05 16:20:09 crc kubenswrapper[4778]: I1205 16:20:09.196432 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:09 crc kubenswrapper[4778]: I1205 16:20:09.226858 4778 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=4.226842032 podStartE2EDuration="4.226842032s" podCreationTimestamp="2025-12-05 16:20:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:20:09.224237683 +0000 UTC m=+1496.328034063" watchObservedRunningTime="2025-12-05 16:20:09.226842032 +0000 UTC m=+1496.330638412" Dec 05 16:20:10 crc kubenswrapper[4778]: I1205 16:20:10.210356 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"f9ddb739-bacb-47f2-9c4f-9a70824df568","Type":"ContainerStarted","Data":"c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa"} Dec 05 16:20:10 crc kubenswrapper[4778]: I1205 16:20:10.213975 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"a23b2048-eacf-48b8-b198-e762af53e3ec","Type":"ContainerStarted","Data":"8ad3777afd5a1a5d5f780353472b8611b285bd6739073bbc15538abd97ed6a42"} Dec 05 16:20:10 crc kubenswrapper[4778]: I1205 16:20:10.235020 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.590070729 podStartE2EDuration="5.235003335s" podCreationTimestamp="2025-12-05 16:20:05 +0000 UTC" firstStartedPulling="2025-12-05 16:20:06.477451087 +0000 UTC m=+1493.581247477" lastFinishedPulling="2025-12-05 16:20:09.122383703 +0000 UTC m=+1496.226180083" observedRunningTime="2025-12-05 16:20:10.22956294 +0000 UTC m=+1497.333359320" watchObservedRunningTime="2025-12-05 16:20:10.235003335 +0000 UTC m=+1497.338799715" Dec 05 16:20:10 crc kubenswrapper[4778]: I1205 16:20:10.251460 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.64566464 podStartE2EDuration="5.251443973s" podCreationTimestamp="2025-12-05 16:20:05 +0000 UTC" firstStartedPulling="2025-12-05 16:20:06.522524457 +0000 UTC m=+1493.626320837" lastFinishedPulling="2025-12-05 16:20:09.12830379 +0000 UTC m=+1496.232100170" observedRunningTime="2025-12-05 16:20:10.246312706 +0000 UTC m=+1497.350109086" watchObservedRunningTime="2025-12-05 16:20:10.251443973 +0000 UTC m=+1497.355240343" Dec 05 16:20:10 crc kubenswrapper[4778]: I1205 16:20:10.803208 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:10 crc kubenswrapper[4778]: I1205 16:20:10.892511 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:11 crc kubenswrapper[4778]: I1205 16:20:11.219763 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 16:20:11 crc kubenswrapper[4778]: I1205 16:20:11.615171 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:15 crc kubenswrapper[4778]: I1205 16:20:15.802326 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:15 crc kubenswrapper[4778]: I1205 16:20:15.807918 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:15 crc kubenswrapper[4778]: I1205 16:20:15.862515 
4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:15 crc kubenswrapper[4778]: I1205 16:20:15.892196 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:15 crc kubenswrapper[4778]: I1205 16:20:15.894986 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:15 crc kubenswrapper[4778]: I1205 16:20:15.925425 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:16 crc kubenswrapper[4778]: I1205 16:20:16.273547 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:16 crc kubenswrapper[4778]: I1205 16:20:16.281625 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:16 crc kubenswrapper[4778]: I1205 16:20:16.306328 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:16 crc kubenswrapper[4778]: I1205 16:20:16.306768 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:18 crc kubenswrapper[4778]: I1205 16:20:18.313899 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:18 crc kubenswrapper[4778]: I1205 16:20:18.314816 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="ceilometer-central-agent" containerID="cri-o://e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e" gracePeriod=30 Dec 05 16:20:18 crc kubenswrapper[4778]: I1205 16:20:18.314953 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="proxy-httpd" containerID="cri-o://39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599" gracePeriod=30 Dec 05 16:20:18 crc kubenswrapper[4778]: I1205 16:20:18.314997 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="sg-core" containerID="cri-o://c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91" gracePeriod=30 Dec 05 16:20:18 crc kubenswrapper[4778]: I1205 16:20:18.315040 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="ceilometer-notification-agent" containerID="cri-o://b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753" gracePeriod=30 Dec 05 16:20:19 crc kubenswrapper[4778]: I1205 16:20:19.311955 4778 generic.go:334] "Generic (PLEG): container finished" podID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerID="39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599" exitCode=0 Dec 05 16:20:19 crc kubenswrapper[4778]: I1205 16:20:19.311987 4778 generic.go:334] "Generic (PLEG): container finished" podID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" 
containerID="c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91" exitCode=2 Dec 05 16:20:19 crc kubenswrapper[4778]: I1205 16:20:19.311995 4778 generic.go:334] "Generic (PLEG): container finished" podID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerID="e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e" exitCode=0 Dec 05 16:20:19 crc kubenswrapper[4778]: I1205 16:20:19.312015 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8c2e058-ef3f-40d8-b946-6b1539e3491e","Type":"ContainerDied","Data":"39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599"} Dec 05 16:20:19 crc kubenswrapper[4778]: I1205 16:20:19.312041 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8c2e058-ef3f-40d8-b946-6b1539e3491e","Type":"ContainerDied","Data":"c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91"} Dec 05 16:20:19 crc kubenswrapper[4778]: I1205 16:20:19.312051 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8c2e058-ef3f-40d8-b946-6b1539e3491e","Type":"ContainerDied","Data":"e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e"} Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.098859 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.125:3000/\": dial tcp 10.217.0.125:3000: connect: connection refused" Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.369910 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-45mpz"] Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.381656 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-45mpz"] Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.416274 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher2006-account-delete-7jngf"] Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.417716 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher2006-account-delete-7jngf" Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.433422 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher2006-account-delete-7jngf"] Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.466907 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.467108 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="f9ddb739-bacb-47f2-9c4f-9a70824df568" containerName="watcher-applier" containerID="cri-o://c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa" gracePeriod=30 Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.505878 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxl65\" (UniqueName: \"kubernetes.io/projected/d8fe1844-19f3-4e21-b70c-6e6d6b455809-kube-api-access-dxl65\") pod \"watcher2006-account-delete-7jngf\" (UID: \"d8fe1844-19f3-4e21-b70c-6e6d6b455809\") " pod="watcher-kuttl-default/watcher2006-account-delete-7jngf" Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.505934 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8fe1844-19f3-4e21-b70c-6e6d6b455809-operator-scripts\") pod \"watcher2006-account-delete-7jngf\" (UID: \"d8fe1844-19f3-4e21-b70c-6e6d6b455809\") " pod="watcher-kuttl-default/watcher2006-account-delete-7jngf" Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.586877 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.587175 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" containerName="watcher-kuttl-api-log" containerID="cri-o://9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8" gracePeriod=30 Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.587740 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" containerName="watcher-api" containerID="cri-o://6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d" gracePeriod=30 Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.608274 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8fe1844-19f3-4e21-b70c-6e6d6b455809-operator-scripts\") pod \"watcher2006-account-delete-7jngf\" (UID: \"d8fe1844-19f3-4e21-b70c-6e6d6b455809\") " pod="watcher-kuttl-default/watcher2006-account-delete-7jngf" Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.608509 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxl65\" (UniqueName: \"kubernetes.io/projected/d8fe1844-19f3-4e21-b70c-6e6d6b455809-kube-api-access-dxl65\") pod \"watcher2006-account-delete-7jngf\" (UID: \"d8fe1844-19f3-4e21-b70c-6e6d6b455809\") " pod="watcher-kuttl-default/watcher2006-account-delete-7jngf" Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.609785 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8fe1844-19f3-4e21-b70c-6e6d6b455809-operator-scripts\") pod \"watcher2006-account-delete-7jngf\" (UID: \"d8fe1844-19f3-4e21-b70c-6e6d6b455809\") " pod="watcher-kuttl-default/watcher2006-account-delete-7jngf" Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.651627 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.651852 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="a23b2048-eacf-48b8-b198-e762af53e3ec" containerName="watcher-decision-engine" containerID="cri-o://8ad3777afd5a1a5d5f780353472b8611b285bd6739073bbc15538abd97ed6a42" gracePeriod=30 Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.651973 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxl65\" (UniqueName: \"kubernetes.io/projected/d8fe1844-19f3-4e21-b70c-6e6d6b455809-kube-api-access-dxl65\") pod \"watcher2006-account-delete-7jngf\" (UID: \"d8fe1844-19f3-4e21-b70c-6e6d6b455809\") " pod="watcher-kuttl-default/watcher2006-account-delete-7jngf" Dec 05 16:20:20 crc kubenswrapper[4778]: I1205 16:20:20.734717 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher2006-account-delete-7jngf" Dec 05 16:20:20 crc kubenswrapper[4778]: E1205 16:20:20.895782 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:20:20 crc kubenswrapper[4778]: E1205 16:20:20.898007 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:20:20 crc kubenswrapper[4778]: E1205 16:20:20.899858 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:20:20 crc kubenswrapper[4778]: E1205 16:20:20.899902 4778 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="f9ddb739-bacb-47f2-9c4f-9a70824df568" containerName="watcher-applier" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.260417 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3bd8a9e-4fe1-4a55-9344-4c7786c53d95" path="/var/lib/kubelet/pods/c3bd8a9e-4fe1-4a55-9344-4c7786c53d95/volumes" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.261145 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher2006-account-delete-7jngf"] Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.355136 4778 generic.go:334] "Generic (PLEG): 
container finished" podID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" containerID="9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8" exitCode=143 Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.355214 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6ac274f9-e27a-4b27-8635-85b34fb26a5a","Type":"ContainerDied","Data":"9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8"} Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.355764 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.369335 4778 generic.go:334] "Generic (PLEG): container finished" podID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerID="b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753" exitCode=0 Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.369440 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8c2e058-ef3f-40d8-b946-6b1539e3491e","Type":"ContainerDied","Data":"b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753"} Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.369469 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8c2e058-ef3f-40d8-b946-6b1539e3491e","Type":"ContainerDied","Data":"a9fbae26e68ace4be17c5665a6540c018ae6143cdacff48677ed8f5596aa77ba"} Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.369486 4778 scope.go:117] "RemoveContainer" containerID="39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.372173 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2006-account-delete-7jngf" event={"ID":"d8fe1844-19f3-4e21-b70c-6e6d6b455809","Type":"ContainerStarted","Data":"ea0c7ccbf1012523f2a5ee690ce69cc1b44efc42c95fbeecf6496b12cab9725a"} Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.418481 4778 scope.go:117] "RemoveContainer" containerID="c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.454126 4778 scope.go:117] "RemoveContainer" containerID="b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.476460 4778 scope.go:117] "RemoveContainer" containerID="e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.502747 4778 scope.go:117] "RemoveContainer" containerID="39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599" Dec 05 16:20:21 crc kubenswrapper[4778]: E1205 16:20:21.504499 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599\": container with ID starting with 39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599 not found: ID does not exist" containerID="39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.504533 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599"} err="failed to get container status 
\"39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599\": rpc error: code = NotFound desc = could not find container \"39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599\": container with ID starting with 39f572624970be2c2b48cb737a4d27380f14677600ac68b4c0618fba3c7b9599 not found: ID does not exist" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.504556 4778 scope.go:117] "RemoveContainer" containerID="c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91" Dec 05 16:20:21 crc kubenswrapper[4778]: E1205 16:20:21.504828 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91\": container with ID starting with c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91 not found: ID does not exist" containerID="c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.504887 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91"} err="failed to get container status \"c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91\": rpc error: code = NotFound desc = could not find container \"c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91\": container with ID starting with c42bd538b7ed7a8f323fd93226260f12cea10d30bb1ac2c0ada5a03c3d263f91 not found: ID does not exist" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.504922 4778 scope.go:117] "RemoveContainer" containerID="b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753" Dec 05 16:20:21 crc kubenswrapper[4778]: E1205 16:20:21.505161 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753\": container with ID starting with b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753 not found: ID does not exist" containerID="b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.505188 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753"} err="failed to get container status \"b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753\": rpc error: code = NotFound desc = could not find container \"b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753\": container with ID starting with b83cf684d5b296224e3cb8f7789684ef5aa477060b140859c7364bf3e3242753 not found: ID does not exist" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.505203 4778 scope.go:117] "RemoveContainer" containerID="e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e" Dec 05 16:20:21 crc kubenswrapper[4778]: E1205 16:20:21.505417 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e\": container with ID starting with e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e not found: ID does not exist" containerID="e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.505450 4778 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e"} err="failed to get container status \"e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e\": rpc error: code = NotFound desc = could not find container \"e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e\": container with ID starting with e0790f8a1bf01c4f46e2ca75e9d9dce8444c9130d847cf48056072107538f11e not found: ID does not exist" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.522141 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-sg-core-conf-yaml\") pod \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.522307 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2e058-ef3f-40d8-b946-6b1539e3491e-log-httpd\") pod \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.522343 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-config-data\") pod \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.522404 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-ceilometer-tls-certs\") pod \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.522441 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-scripts\") pod \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.522490 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-combined-ca-bundle\") pod \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.522554 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2e058-ef3f-40d8-b946-6b1539e3491e-run-httpd\") pod \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.522592 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gthv\" (UniqueName: \"kubernetes.io/projected/e8c2e058-ef3f-40d8-b946-6b1539e3491e-kube-api-access-5gthv\") pod \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\" (UID: \"e8c2e058-ef3f-40d8-b946-6b1539e3491e\") " Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.523265 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c2e058-ef3f-40d8-b946-6b1539e3491e-run-httpd" (OuterVolumeSpecName: 
"run-httpd") pod "e8c2e058-ef3f-40d8-b946-6b1539e3491e" (UID: "e8c2e058-ef3f-40d8-b946-6b1539e3491e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.523736 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c2e058-ef3f-40d8-b946-6b1539e3491e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e8c2e058-ef3f-40d8-b946-6b1539e3491e" (UID: "e8c2e058-ef3f-40d8-b946-6b1539e3491e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.532521 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c2e058-ef3f-40d8-b946-6b1539e3491e-kube-api-access-5gthv" (OuterVolumeSpecName: "kube-api-access-5gthv") pod "e8c2e058-ef3f-40d8-b946-6b1539e3491e" (UID: "e8c2e058-ef3f-40d8-b946-6b1539e3491e"). InnerVolumeSpecName "kube-api-access-5gthv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.532558 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-scripts" (OuterVolumeSpecName: "scripts") pod "e8c2e058-ef3f-40d8-b946-6b1539e3491e" (UID: "e8c2e058-ef3f-40d8-b946-6b1539e3491e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.550228 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e8c2e058-ef3f-40d8-b946-6b1539e3491e" (UID: "e8c2e058-ef3f-40d8-b946-6b1539e3491e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.582669 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e8c2e058-ef3f-40d8-b946-6b1539e3491e" (UID: "e8c2e058-ef3f-40d8-b946-6b1539e3491e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.607121 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8c2e058-ef3f-40d8-b946-6b1539e3491e" (UID: "e8c2e058-ef3f-40d8-b946-6b1539e3491e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.619843 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-config-data" (OuterVolumeSpecName: "config-data") pod "e8c2e058-ef3f-40d8-b946-6b1539e3491e" (UID: "e8c2e058-ef3f-40d8-b946-6b1539e3491e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.624429 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2e058-ef3f-40d8-b946-6b1539e3491e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.624455 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.624464 4778 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.624477 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.624485 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.624494 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2e058-ef3f-40d8-b946-6b1539e3491e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.624502 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gthv\" (UniqueName: \"kubernetes.io/projected/e8c2e058-ef3f-40d8-b946-6b1539e3491e-kube-api-access-5gthv\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.624512 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c2e058-ef3f-40d8-b946-6b1539e3491e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.717702 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.129:9322/\": read tcp 10.217.0.2:35334->10.217.0.129:9322: read: connection reset by peer" Dec 05 16:20:21 crc kubenswrapper[4778]: I1205 16:20:21.718194 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.129:9322/\": read tcp 10.217.0.2:35330->10.217.0.129:9322: read: connection reset by peer" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.183072 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.336092 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kplqt\" (UniqueName: \"kubernetes.io/projected/6ac274f9-e27a-4b27-8635-85b34fb26a5a-kube-api-access-kplqt\") pod \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.336275 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac274f9-e27a-4b27-8635-85b34fb26a5a-logs\") pod \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.336316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-combined-ca-bundle\") pod \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.336442 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-config-data\") pod \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.336465 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-custom-prometheus-ca\") pod \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\" (UID: \"6ac274f9-e27a-4b27-8635-85b34fb26a5a\") " Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.336968 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ac274f9-e27a-4b27-8635-85b34fb26a5a-logs" (OuterVolumeSpecName: "logs") pod "6ac274f9-e27a-4b27-8635-85b34fb26a5a" (UID: "6ac274f9-e27a-4b27-8635-85b34fb26a5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.338339 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac274f9-e27a-4b27-8635-85b34fb26a5a-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.349653 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac274f9-e27a-4b27-8635-85b34fb26a5a-kube-api-access-kplqt" (OuterVolumeSpecName: "kube-api-access-kplqt") pod "6ac274f9-e27a-4b27-8635-85b34fb26a5a" (UID: "6ac274f9-e27a-4b27-8635-85b34fb26a5a"). InnerVolumeSpecName "kube-api-access-kplqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.363487 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "6ac274f9-e27a-4b27-8635-85b34fb26a5a" (UID: "6ac274f9-e27a-4b27-8635-85b34fb26a5a"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.375593 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ac274f9-e27a-4b27-8635-85b34fb26a5a" (UID: "6ac274f9-e27a-4b27-8635-85b34fb26a5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.377078 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-config-data" (OuterVolumeSpecName: "config-data") pod "6ac274f9-e27a-4b27-8635-85b34fb26a5a" (UID: "6ac274f9-e27a-4b27-8635-85b34fb26a5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.385349 4778 generic.go:334] "Generic (PLEG): container finished" podID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" containerID="6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d" exitCode=0 Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.385506 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.385688 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6ac274f9-e27a-4b27-8635-85b34fb26a5a","Type":"ContainerDied","Data":"6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d"} Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.385748 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6ac274f9-e27a-4b27-8635-85b34fb26a5a","Type":"ContainerDied","Data":"7ebf4c75b9a3fc5fbf012f86e7e0905418c99233c557887a216ce20f74c3ae6d"} Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.385769 4778 scope.go:117] "RemoveContainer" containerID="6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.387933 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.392564 4778 generic.go:334] "Generic (PLEG): container finished" podID="d8fe1844-19f3-4e21-b70c-6e6d6b455809" containerID="6035b9073a3ba4c4af735b261e3830b18b41f9149b595c5df4211feb1177d47c" exitCode=0 Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.392888 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2006-account-delete-7jngf" event={"ID":"d8fe1844-19f3-4e21-b70c-6e6d6b455809","Type":"ContainerDied","Data":"6035b9073a3ba4c4af735b261e3830b18b41f9149b595c5df4211feb1177d47c"} Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.412497 4778 scope.go:117] "RemoveContainer" containerID="9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.435264 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.436990 4778 scope.go:117] "RemoveContainer" containerID="6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d" Dec 05 16:20:22 crc kubenswrapper[4778]: E1205 16:20:22.437356 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d\": container with ID starting with 6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d not found: ID does not exist" containerID="6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.437409 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d"} err="failed to get container status \"6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d\": rpc error: code = NotFound desc = could not find container \"6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d\": container with ID starting with 6b93a0dceaddb3a128b365acc923281c32b3874da8fe3c5040376bf5ca55162d not found: ID does not exist" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.437429 4778 scope.go:117] "RemoveContainer" containerID="9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8" Dec 05 16:20:22 crc kubenswrapper[4778]: E1205 16:20:22.438166 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8\": container with ID starting with 9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8 not found: ID does not exist" containerID="9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.438186 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8"} err="failed to get container status \"9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8\": rpc error: code = NotFound desc = could not find container \"9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8\": container with ID starting with 9010be33c522b81538f538094e0d2c25e522292d0690a20b856ca3fb88c25bf8 not found: ID does not exist" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.440047 4778 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.440065 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.440075 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6ac274f9-e27a-4b27-8635-85b34fb26a5a-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.440084 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kplqt\" (UniqueName: \"kubernetes.io/projected/6ac274f9-e27a-4b27-8635-85b34fb26a5a-kube-api-access-kplqt\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.441680 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.466990 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.475316 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.482749 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:22 crc kubenswrapper[4778]: E1205 16:20:22.483144 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="ceilometer-central-agent" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.483168 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="ceilometer-central-agent" Dec 05 16:20:22 crc kubenswrapper[4778]: E1205 16:20:22.483190 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="ceilometer-notification-agent" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.483201 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="ceilometer-notification-agent" Dec 05 16:20:22 crc kubenswrapper[4778]: E1205 16:20:22.483213 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" containerName="watcher-kuttl-api-log" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.483221 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" containerName="watcher-kuttl-api-log" Dec 05 16:20:22 crc kubenswrapper[4778]: E1205 16:20:22.483250 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="sg-core" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.483260 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="sg-core" Dec 05 16:20:22 crc kubenswrapper[4778]: E1205 16:20:22.483280 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" containerName="watcher-api" Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 
16:20:22.483288 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" containerName="watcher-api"
Dec 05 16:20:22 crc kubenswrapper[4778]: E1205 16:20:22.483298 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="proxy-httpd"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.483305 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="proxy-httpd"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.483526 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="ceilometer-notification-agent"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.483559 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="sg-core"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.483600 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="proxy-httpd"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.483622 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" containerName="ceilometer-central-agent"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.483636 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" containerName="watcher-api"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.483654 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" containerName="watcher-kuttl-api-log"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.485510 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.488088 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.488837 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.488989 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.491445 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.643404 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71865766-7f47-4030-8900-e2f914bc1ae4-log-httpd\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.643465 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.643501 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxbtm\" (UniqueName: \"kubernetes.io/projected/71865766-7f47-4030-8900-e2f914bc1ae4-kube-api-access-bxbtm\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.643532 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.643553 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71865766-7f47-4030-8900-e2f914bc1ae4-run-httpd\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.643591 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-config-data\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.643618 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.643635 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-scripts\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.744909 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-config-data\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.744964 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.744988 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-scripts\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.745057 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71865766-7f47-4030-8900-e2f914bc1ae4-log-httpd\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.745672 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71865766-7f47-4030-8900-e2f914bc1ae4-log-httpd\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.745797 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.745887 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxbtm\" (UniqueName: \"kubernetes.io/projected/71865766-7f47-4030-8900-e2f914bc1ae4-kube-api-access-bxbtm\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.745985 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.746026 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71865766-7f47-4030-8900-e2f914bc1ae4-run-httpd\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.746522 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71865766-7f47-4030-8900-e2f914bc1ae4-run-httpd\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.749076 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-scripts\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.749664 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.749834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.751796 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.752496 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-config-data\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.766687 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxbtm\" (UniqueName: \"kubernetes.io/projected/71865766-7f47-4030-8900-e2f914bc1ae4-kube-api-access-bxbtm\") pod \"ceilometer-0\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:22 crc kubenswrapper[4778]: I1205 16:20:22.800972 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.170687 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.259146 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac274f9-e27a-4b27-8635-85b34fb26a5a" path="/var/lib/kubelet/pods/6ac274f9-e27a-4b27-8635-85b34fb26a5a/volumes"
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.260055 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8c2e058-ef3f-40d8-b946-6b1539e3491e" path="/var/lib/kubelet/pods/e8c2e058-ef3f-40d8-b946-6b1539e3491e/volumes"
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.297605 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.402182 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"71865766-7f47-4030-8900-e2f914bc1ae4","Type":"ContainerStarted","Data":"e9c1ce71d2b079bff92a5a92d3524c8eed52627753d17ce03ab708f8a9946842"}
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.754035 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher2006-account-delete-7jngf"
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.862515 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxl65\" (UniqueName: \"kubernetes.io/projected/d8fe1844-19f3-4e21-b70c-6e6d6b455809-kube-api-access-dxl65\") pod \"d8fe1844-19f3-4e21-b70c-6e6d6b455809\" (UID: \"d8fe1844-19f3-4e21-b70c-6e6d6b455809\") "
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.862657 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8fe1844-19f3-4e21-b70c-6e6d6b455809-operator-scripts\") pod \"d8fe1844-19f3-4e21-b70c-6e6d6b455809\" (UID: \"d8fe1844-19f3-4e21-b70c-6e6d6b455809\") "
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.863611 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8fe1844-19f3-4e21-b70c-6e6d6b455809-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8fe1844-19f3-4e21-b70c-6e6d6b455809" (UID: "d8fe1844-19f3-4e21-b70c-6e6d6b455809"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.867007 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8fe1844-19f3-4e21-b70c-6e6d6b455809-kube-api-access-dxl65" (OuterVolumeSpecName: "kube-api-access-dxl65") pod "d8fe1844-19f3-4e21-b70c-6e6d6b455809" (UID: "d8fe1844-19f3-4e21-b70c-6e6d6b455809"). InnerVolumeSpecName "kube-api-access-dxl65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.908415 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.968986 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8fe1844-19f3-4e21-b70c-6e6d6b455809-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:23 crc kubenswrapper[4778]: I1205 16:20:23.969048 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxl65\" (UniqueName: \"kubernetes.io/projected/d8fe1844-19f3-4e21-b70c-6e6d6b455809-kube-api-access-dxl65\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.070207 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddb739-bacb-47f2-9c4f-9a70824df568-combined-ca-bundle\") pod \"f9ddb739-bacb-47f2-9c4f-9a70824df568\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") "
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.070287 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ddb739-bacb-47f2-9c4f-9a70824df568-logs\") pod \"f9ddb739-bacb-47f2-9c4f-9a70824df568\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") "
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.070408 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq72f\" (UniqueName: \"kubernetes.io/projected/f9ddb739-bacb-47f2-9c4f-9a70824df568-kube-api-access-kq72f\") pod \"f9ddb739-bacb-47f2-9c4f-9a70824df568\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") "
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.070454 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddb739-bacb-47f2-9c4f-9a70824df568-config-data\") pod \"f9ddb739-bacb-47f2-9c4f-9a70824df568\" (UID: \"f9ddb739-bacb-47f2-9c4f-9a70824df568\") "
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.070686 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ddb739-bacb-47f2-9c4f-9a70824df568-logs" (OuterVolumeSpecName: "logs") pod "f9ddb739-bacb-47f2-9c4f-9a70824df568" (UID: "f9ddb739-bacb-47f2-9c4f-9a70824df568"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.070915 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ddb739-bacb-47f2-9c4f-9a70824df568-logs\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.079533 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ddb739-bacb-47f2-9c4f-9a70824df568-kube-api-access-kq72f" (OuterVolumeSpecName: "kube-api-access-kq72f") pod "f9ddb739-bacb-47f2-9c4f-9a70824df568" (UID: "f9ddb739-bacb-47f2-9c4f-9a70824df568"). InnerVolumeSpecName "kube-api-access-kq72f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.097494 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ddb739-bacb-47f2-9c4f-9a70824df568-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9ddb739-bacb-47f2-9c4f-9a70824df568" (UID: "f9ddb739-bacb-47f2-9c4f-9a70824df568"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.113326 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ddb739-bacb-47f2-9c4f-9a70824df568-config-data" (OuterVolumeSpecName: "config-data") pod "f9ddb739-bacb-47f2-9c4f-9a70824df568" (UID: "f9ddb739-bacb-47f2-9c4f-9a70824df568"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.173254 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddb739-bacb-47f2-9c4f-9a70824df568-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.173308 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq72f\" (UniqueName: \"kubernetes.io/projected/f9ddb739-bacb-47f2-9c4f-9a70824df568-kube-api-access-kq72f\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.173330 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddb739-bacb-47f2-9c4f-9a70824df568-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.422100 4778 generic.go:334] "Generic (PLEG): container finished" podID="f9ddb739-bacb-47f2-9c4f-9a70824df568" containerID="c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa" exitCode=0
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.422198 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.422200 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"f9ddb739-bacb-47f2-9c4f-9a70824df568","Type":"ContainerDied","Data":"c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa"}
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.422331 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"f9ddb739-bacb-47f2-9c4f-9a70824df568","Type":"ContainerDied","Data":"c6b844df0630f22732ffcb266e0c5f30b126dff63430b686e946a1b72d1ee9b0"}
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.422357 4778 scope.go:117] "RemoveContainer" containerID="c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa"
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.425868 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher2006-account-delete-7jngf"
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.425869 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2006-account-delete-7jngf" event={"ID":"d8fe1844-19f3-4e21-b70c-6e6d6b455809","Type":"ContainerDied","Data":"ea0c7ccbf1012523f2a5ee690ce69cc1b44efc42c95fbeecf6496b12cab9725a"}
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.425986 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea0c7ccbf1012523f2a5ee690ce69cc1b44efc42c95fbeecf6496b12cab9725a"
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.433181 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"71865766-7f47-4030-8900-e2f914bc1ae4","Type":"ContainerStarted","Data":"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4"}
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.434645 4778 generic.go:334] "Generic (PLEG): container finished" podID="a23b2048-eacf-48b8-b198-e762af53e3ec" containerID="8ad3777afd5a1a5d5f780353472b8611b285bd6739073bbc15538abd97ed6a42" exitCode=0
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.434699 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"a23b2048-eacf-48b8-b198-e762af53e3ec","Type":"ContainerDied","Data":"8ad3777afd5a1a5d5f780353472b8611b285bd6739073bbc15538abd97ed6a42"}
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.456618 4778 scope.go:117] "RemoveContainer" containerID="c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa"
Dec 05 16:20:24 crc kubenswrapper[4778]: E1205 16:20:24.457761 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa\": container with ID starting with c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa not found: ID does not exist" containerID="c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa"
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.457802 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa"} err="failed to get container status \"c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa\": rpc error: code = NotFound desc = could not find container \"c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa\": container with ID starting with c0e8f7d0ca7033d8b5b5f5afb036a9cf54c9dc62945d4da695615038c32b97aa not found: ID does not exist"
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.471649 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.488652 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 16:20:24 crc kubenswrapper[4778]: I1205 16:20:24.918422 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.085741 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-custom-prometheus-ca\") pod \"a23b2048-eacf-48b8-b198-e762af53e3ec\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") "
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.085800 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-config-data\") pod \"a23b2048-eacf-48b8-b198-e762af53e3ec\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") "
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.085927 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7bv2\" (UniqueName: \"kubernetes.io/projected/a23b2048-eacf-48b8-b198-e762af53e3ec-kube-api-access-x7bv2\") pod \"a23b2048-eacf-48b8-b198-e762af53e3ec\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") "
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.085971 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-combined-ca-bundle\") pod \"a23b2048-eacf-48b8-b198-e762af53e3ec\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") "
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.085992 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a23b2048-eacf-48b8-b198-e762af53e3ec-logs\") pod \"a23b2048-eacf-48b8-b198-e762af53e3ec\" (UID: \"a23b2048-eacf-48b8-b198-e762af53e3ec\") "
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.086494 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a23b2048-eacf-48b8-b198-e762af53e3ec-logs" (OuterVolumeSpecName: "logs") pod "a23b2048-eacf-48b8-b198-e762af53e3ec" (UID: "a23b2048-eacf-48b8-b198-e762af53e3ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.090191 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a23b2048-eacf-48b8-b198-e762af53e3ec-kube-api-access-x7bv2" (OuterVolumeSpecName: "kube-api-access-x7bv2") pod "a23b2048-eacf-48b8-b198-e762af53e3ec" (UID: "a23b2048-eacf-48b8-b198-e762af53e3ec"). InnerVolumeSpecName "kube-api-access-x7bv2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.116591 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "a23b2048-eacf-48b8-b198-e762af53e3ec" (UID: "a23b2048-eacf-48b8-b198-e762af53e3ec"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.142477 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a23b2048-eacf-48b8-b198-e762af53e3ec" (UID: "a23b2048-eacf-48b8-b198-e762af53e3ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.164064 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-config-data" (OuterVolumeSpecName: "config-data") pod "a23b2048-eacf-48b8-b198-e762af53e3ec" (UID: "a23b2048-eacf-48b8-b198-e762af53e3ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.188039 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.188071 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.188081 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7bv2\" (UniqueName: \"kubernetes.io/projected/a23b2048-eacf-48b8-b198-e762af53e3ec-kube-api-access-x7bv2\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.188091 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23b2048-eacf-48b8-b198-e762af53e3ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.188102 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a23b2048-eacf-48b8-b198-e762af53e3ec-logs\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.259061 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ddb739-bacb-47f2-9c4f-9a70824df568" path="/var/lib/kubelet/pods/f9ddb739-bacb-47f2-9c4f-9a70824df568/volumes"
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.446436 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"71865766-7f47-4030-8900-e2f914bc1ae4","Type":"ContainerStarted","Data":"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934"}
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.452458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"a23b2048-eacf-48b8-b198-e762af53e3ec","Type":"ContainerDied","Data":"1738ed0d523fa8bef2775c16917bef6da6339ce47da601c511f064fad1db9b7a"}
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.452523 4778 scope.go:117] "RemoveContainer" containerID="8ad3777afd5a1a5d5f780353472b8611b285bd6739073bbc15538abd97ed6a42"
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.452525 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.469351 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-kxr4q"]
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.504327 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-kxr4q"]
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.510470 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.517775 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.525179 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher2006-account-delete-7jngf"]
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.533165 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-2006-account-create-update-f2bbc"]
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.541537 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher2006-account-delete-7jngf"]
Dec 05 16:20:25 crc kubenswrapper[4778]: I1205 16:20:25.548178 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-2006-account-create-update-f2bbc"]
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.421150 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-hbv72"]
Dec 05 16:20:26 crc kubenswrapper[4778]: E1205 16:20:26.421705 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ddb739-bacb-47f2-9c4f-9a70824df568" containerName="watcher-applier"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.421720 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ddb739-bacb-47f2-9c4f-9a70824df568" containerName="watcher-applier"
Dec 05 16:20:26 crc kubenswrapper[4778]: E1205 16:20:26.421730 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23b2048-eacf-48b8-b198-e762af53e3ec" containerName="watcher-decision-engine"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.421736 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23b2048-eacf-48b8-b198-e762af53e3ec" containerName="watcher-decision-engine"
Dec 05 16:20:26 crc kubenswrapper[4778]: E1205 16:20:26.421756 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8fe1844-19f3-4e21-b70c-6e6d6b455809" containerName="mariadb-account-delete"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.421763 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fe1844-19f3-4e21-b70c-6e6d6b455809" containerName="mariadb-account-delete"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.421898 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a23b2048-eacf-48b8-b198-e762af53e3ec" containerName="watcher-decision-engine"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.421911 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ddb739-bacb-47f2-9c4f-9a70824df568" containerName="watcher-applier"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.421936 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8fe1844-19f3-4e21-b70c-6e6d6b455809" containerName="mariadb-account-delete"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.422433 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-hbv72"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.435157 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-hbv72"]
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.453663 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"]
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.455485 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.457525 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.466095 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"71865766-7f47-4030-8900-e2f914bc1ae4","Type":"ContainerStarted","Data":"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0"}
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.496659 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"]
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.510359 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtlqz\" (UniqueName: \"kubernetes.io/projected/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6-kube-api-access-xtlqz\") pod \"watcher-db-create-hbv72\" (UID: \"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6\") " pod="watcher-kuttl-default/watcher-db-create-hbv72"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.510437 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6-operator-scripts\") pod \"watcher-db-create-hbv72\" (UID: \"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6\") " pod="watcher-kuttl-default/watcher-db-create-hbv72"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.612132 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc76cf1b-e970-4771-bd28-09ad798f33e5-operator-scripts\") pod \"watcher-9df0-account-create-update-qjjx2\" (UID: \"dc76cf1b-e970-4771-bd28-09ad798f33e5\") " pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.612288 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtlqz\" (UniqueName: \"kubernetes.io/projected/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6-kube-api-access-xtlqz\") pod \"watcher-db-create-hbv72\" (UID: \"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6\") " pod="watcher-kuttl-default/watcher-db-create-hbv72"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.612348 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6-operator-scripts\") pod \"watcher-db-create-hbv72\" (UID: \"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6\") " pod="watcher-kuttl-default/watcher-db-create-hbv72"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.612424 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkz7z\" (UniqueName: \"kubernetes.io/projected/dc76cf1b-e970-4771-bd28-09ad798f33e5-kube-api-access-kkz7z\") pod \"watcher-9df0-account-create-update-qjjx2\" (UID: \"dc76cf1b-e970-4771-bd28-09ad798f33e5\") " pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.613490 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6-operator-scripts\") pod \"watcher-db-create-hbv72\" (UID: \"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6\") " pod="watcher-kuttl-default/watcher-db-create-hbv72"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.646682 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtlqz\" (UniqueName: \"kubernetes.io/projected/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6-kube-api-access-xtlqz\") pod \"watcher-db-create-hbv72\" (UID: \"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6\") " pod="watcher-kuttl-default/watcher-db-create-hbv72"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.724083 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkz7z\" (UniqueName: \"kubernetes.io/projected/dc76cf1b-e970-4771-bd28-09ad798f33e5-kube-api-access-kkz7z\") pod \"watcher-9df0-account-create-update-qjjx2\" (UID: \"dc76cf1b-e970-4771-bd28-09ad798f33e5\") " pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.724341 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc76cf1b-e970-4771-bd28-09ad798f33e5-operator-scripts\") pod \"watcher-9df0-account-create-update-qjjx2\" (UID: \"dc76cf1b-e970-4771-bd28-09ad798f33e5\") " pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.725312 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc76cf1b-e970-4771-bd28-09ad798f33e5-operator-scripts\") pod \"watcher-9df0-account-create-update-qjjx2\" (UID: \"dc76cf1b-e970-4771-bd28-09ad798f33e5\") " pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.742017 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-hbv72"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.752530 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkz7z\" (UniqueName: \"kubernetes.io/projected/dc76cf1b-e970-4771-bd28-09ad798f33e5-kube-api-access-kkz7z\") pod \"watcher-9df0-account-create-update-qjjx2\" (UID: \"dc76cf1b-e970-4771-bd28-09ad798f33e5\") " pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"
Dec 05 16:20:26 crc kubenswrapper[4778]: I1205 16:20:26.776568 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.232663 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"]
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.261968 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a23b2048-eacf-48b8-b198-e762af53e3ec" path="/var/lib/kubelet/pods/a23b2048-eacf-48b8-b198-e762af53e3ec/volumes"
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.262703 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3337ad4-afc6-43ab-bcad-98583fbed9bf" path="/var/lib/kubelet/pods/a3337ad4-afc6-43ab-bcad-98583fbed9bf/volumes"
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.263279 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5b79508-5cee-451c-8622-95bbc1a98a14" path="/var/lib/kubelet/pods/a5b79508-5cee-451c-8622-95bbc1a98a14/volumes"
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.267404 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8fe1844-19f3-4e21-b70c-6e6d6b455809" path="/var/lib/kubelet/pods/d8fe1844-19f3-4e21-b70c-6e6d6b455809/volumes"
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.268012 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-hbv72"]
Dec 05 16:20:27 crc kubenswrapper[4778]: W1205 16:20:27.280380 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb9e88fb_e9c7_42a6_aa64_44d84d7317f6.slice/crio-948a35e99e62b9f2c4dc4fda4a6d333bd27795c654df37919c9a3511de795a7c WatchSource:0}: Error finding container 948a35e99e62b9f2c4dc4fda4a6d333bd27795c654df37919c9a3511de795a7c: Status 404 returned error can't find the container with id 948a35e99e62b9f2c4dc4fda4a6d333bd27795c654df37919c9a3511de795a7c
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.480645 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2" event={"ID":"dc76cf1b-e970-4771-bd28-09ad798f33e5","Type":"ContainerStarted","Data":"25c66e645ce069181d360caddd549661fd3cc12764529f300fa4c3f33e7f104d"}
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.480686 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2" event={"ID":"dc76cf1b-e970-4771-bd28-09ad798f33e5","Type":"ContainerStarted","Data":"e0991d9a8ae801b1e7e176a99a8b7b3822ddd63c682f4ebbe699fda2471e14ef"}
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.484663 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"71865766-7f47-4030-8900-e2f914bc1ae4","Type":"ContainerStarted","Data":"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704"}
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.484849 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="ceilometer-central-agent" containerID="cri-o://a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4" gracePeriod=30
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.485086 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.485142 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="proxy-httpd" containerID="cri-o://0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704" gracePeriod=30
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.485213 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="sg-core" containerID="cri-o://5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0" gracePeriod=30
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.485267 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="ceilometer-notification-agent" containerID="cri-o://cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934" gracePeriod=30
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.489996 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-hbv72" event={"ID":"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6","Type":"ContainerStarted","Data":"8697caa8b628fa542a4f77cad65065fb6e0dcb92725f5221ba626624b12ba613"}
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.490398 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-hbv72" event={"ID":"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6","Type":"ContainerStarted","Data":"948a35e99e62b9f2c4dc4fda4a6d333bd27795c654df37919c9a3511de795a7c"}
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.506411 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2" podStartSLOduration=1.5063951279999999 podStartE2EDuration="1.506395128s" podCreationTimestamp="2025-12-05 16:20:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:20:27.500555122 +0000 UTC m=+1514.604351502" watchObservedRunningTime="2025-12-05 16:20:27.506395128 +0000 UTC m=+1514.610191508"
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.520139 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db-create-hbv72" podStartSLOduration=1.520123333 podStartE2EDuration="1.520123333s" podCreationTimestamp="2025-12-05 16:20:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:20:27.518823249 +0000 UTC m=+1514.622619649" watchObservedRunningTime="2025-12-05 16:20:27.520123333 +0000 UTC m=+1514.623919713"
Dec 05 16:20:27 crc kubenswrapper[4778]: I1205 16:20:27.540129 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.306711488 podStartE2EDuration="5.540111275s" podCreationTimestamp="2025-12-05 16:20:22 +0000 UTC" firstStartedPulling="2025-12-05 16:20:23.309215459 +0000 UTC m=+1510.413011839" lastFinishedPulling="2025-12-05 16:20:26.542615246 +0000 UTC m=+1513.646411626" observedRunningTime="2025-12-05 16:20:27.538812001 +0000 UTC m=+1514.642608391" watchObservedRunningTime="2025-12-05 16:20:27.540111275 +0000 UTC m=+1514.643907655"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.289713 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.451788 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-sg-core-conf-yaml\") pod \"71865766-7f47-4030-8900-e2f914bc1ae4\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") "
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.452070 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-config-data\") pod \"71865766-7f47-4030-8900-e2f914bc1ae4\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") "
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.452456 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-scripts\") pod \"71865766-7f47-4030-8900-e2f914bc1ae4\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") "
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.452565 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71865766-7f47-4030-8900-e2f914bc1ae4-run-httpd\") pod \"71865766-7f47-4030-8900-e2f914bc1ae4\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") "
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.452650 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71865766-7f47-4030-8900-e2f914bc1ae4-log-httpd\") pod \"71865766-7f47-4030-8900-e2f914bc1ae4\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") "
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.452748 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxbtm\" (UniqueName: \"kubernetes.io/projected/71865766-7f47-4030-8900-e2f914bc1ae4-kube-api-access-bxbtm\") pod \"71865766-7f47-4030-8900-e2f914bc1ae4\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") "
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.452859 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-ceilometer-tls-certs\") pod \"71865766-7f47-4030-8900-e2f914bc1ae4\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") "
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.453174 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-combined-ca-bundle\") pod \"71865766-7f47-4030-8900-e2f914bc1ae4\" (UID: \"71865766-7f47-4030-8900-e2f914bc1ae4\") "
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.452910 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71865766-7f47-4030-8900-e2f914bc1ae4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "71865766-7f47-4030-8900-e2f914bc1ae4" (UID: "71865766-7f47-4030-8900-e2f914bc1ae4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.453041 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71865766-7f47-4030-8900-e2f914bc1ae4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "71865766-7f47-4030-8900-e2f914bc1ae4" (UID: "71865766-7f47-4030-8900-e2f914bc1ae4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.454204 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71865766-7f47-4030-8900-e2f914bc1ae4-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.454266 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71865766-7f47-4030-8900-e2f914bc1ae4-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.457186 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-scripts" (OuterVolumeSpecName: "scripts") pod "71865766-7f47-4030-8900-e2f914bc1ae4" (UID: "71865766-7f47-4030-8900-e2f914bc1ae4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.460581 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71865766-7f47-4030-8900-e2f914bc1ae4-kube-api-access-bxbtm" (OuterVolumeSpecName: "kube-api-access-bxbtm") pod "71865766-7f47-4030-8900-e2f914bc1ae4" (UID: "71865766-7f47-4030-8900-e2f914bc1ae4"). InnerVolumeSpecName "kube-api-access-bxbtm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.475424 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "71865766-7f47-4030-8900-e2f914bc1ae4" (UID: "71865766-7f47-4030-8900-e2f914bc1ae4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.497894 4778 generic.go:334] "Generic (PLEG): container finished" podID="fb9e88fb-e9c7-42a6-aa64-44d84d7317f6" containerID="8697caa8b628fa542a4f77cad65065fb6e0dcb92725f5221ba626624b12ba613" exitCode=0
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.497968 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-hbv72" event={"ID":"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6","Type":"ContainerDied","Data":"8697caa8b628fa542a4f77cad65065fb6e0dcb92725f5221ba626624b12ba613"}
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.499815 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "71865766-7f47-4030-8900-e2f914bc1ae4" (UID: "71865766-7f47-4030-8900-e2f914bc1ae4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.500025 4778 generic.go:334] "Generic (PLEG): container finished" podID="dc76cf1b-e970-4771-bd28-09ad798f33e5" containerID="25c66e645ce069181d360caddd549661fd3cc12764529f300fa4c3f33e7f104d" exitCode=0
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.500056 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2" event={"ID":"dc76cf1b-e970-4771-bd28-09ad798f33e5","Type":"ContainerDied","Data":"25c66e645ce069181d360caddd549661fd3cc12764529f300fa4c3f33e7f104d"}
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.502949 4778 generic.go:334] "Generic (PLEG): container finished" podID="71865766-7f47-4030-8900-e2f914bc1ae4" containerID="0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704" exitCode=0
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.502972 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"71865766-7f47-4030-8900-e2f914bc1ae4","Type":"ContainerDied","Data":"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704"}
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.502997 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"71865766-7f47-4030-8900-e2f914bc1ae4","Type":"ContainerDied","Data":"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0"}
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.502960 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.502979 4778 generic.go:334] "Generic (PLEG): container finished" podID="71865766-7f47-4030-8900-e2f914bc1ae4" containerID="5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0" exitCode=2
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.503026 4778 generic.go:334] "Generic (PLEG): container finished" podID="71865766-7f47-4030-8900-e2f914bc1ae4" containerID="cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934" exitCode=0
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.503037 4778 generic.go:334] "Generic (PLEG): container finished" podID="71865766-7f47-4030-8900-e2f914bc1ae4" containerID="a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4" exitCode=0
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.503057 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"71865766-7f47-4030-8900-e2f914bc1ae4","Type":"ContainerDied","Data":"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934"}
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.503098 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"71865766-7f47-4030-8900-e2f914bc1ae4","Type":"ContainerDied","Data":"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4"}
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.503109 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"71865766-7f47-4030-8900-e2f914bc1ae4","Type":"ContainerDied","Data":"e9c1ce71d2b079bff92a5a92d3524c8eed52627753d17ce03ab708f8a9946842"}
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.503127 4778 scope.go:117] "RemoveContainer" containerID="0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.529808 4778 scope.go:117] "RemoveContainer" containerID="5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.536506 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71865766-7f47-4030-8900-e2f914bc1ae4" (UID: "71865766-7f47-4030-8900-e2f914bc1ae4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.554178 4778 scope.go:117] "RemoveContainer" containerID="cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.556522 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.556556 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.556571 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxbtm\" (UniqueName: \"kubernetes.io/projected/71865766-7f47-4030-8900-e2f914bc1ae4-kube-api-access-bxbtm\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.556586 4778 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.556625 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.556799 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-config-data" (OuterVolumeSpecName: "config-data") pod "71865766-7f47-4030-8900-e2f914bc1ae4" (UID: "71865766-7f47-4030-8900-e2f914bc1ae4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.571054 4778 scope.go:117] "RemoveContainer" containerID="a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.587752 4778 scope.go:117] "RemoveContainer" containerID="0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704"
Dec 05 16:20:28 crc kubenswrapper[4778]: E1205 16:20:28.588154 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704\": container with ID starting with 0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704 not found: ID does not exist" containerID="0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.588194 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704"} err="failed to get container status \"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704\": rpc error: code = NotFound desc = could not find container \"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704\": container with ID starting with 0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704 not found: ID does not exist"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.588222 4778 scope.go:117] "RemoveContainer" containerID="5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0"
Dec 05 16:20:28 crc kubenswrapper[4778]: E1205 16:20:28.588525 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0\": container with ID starting with 5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0 not found: ID does not exist" containerID="5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.588564 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0"} err="failed to get container status \"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0\": rpc error: code = NotFound desc = could not find container \"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0\": container with ID starting with 5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0 not found: ID does not exist"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.588579 4778 scope.go:117] "RemoveContainer" containerID="cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934"
Dec 05 16:20:28 crc kubenswrapper[4778]: E1205 16:20:28.588902 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934\": container with ID starting with cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934 not found: ID does not exist" containerID="cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.588923 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934"} err="failed to get container status \"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934\": rpc error: code = NotFound desc = could not find container \"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934\": container with ID starting with cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934 not found: ID does not exist"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.588938 4778 scope.go:117] "RemoveContainer" containerID="a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4"
Dec 05 16:20:28 crc kubenswrapper[4778]: E1205 16:20:28.589144 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4\": container with ID starting with a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4 not found: ID does not exist" containerID="a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.589164 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4"} err="failed to get container status \"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4\": rpc error: code = NotFound desc = could not find container \"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4\": container with ID starting with a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4 not found: ID does not exist"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.589186 4778 scope.go:117] "RemoveContainer" containerID="0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.589546 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704"} err="failed to get container status \"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704\": rpc error: code = NotFound desc = could not find container \"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704\": container with ID starting with 0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704 not found: ID does not exist"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.589564 4778 scope.go:117] "RemoveContainer" containerID="5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.589785 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0"} err="failed to get container status \"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0\": rpc error: code = NotFound desc = could not find container \"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0\": container with ID starting with 5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0 not found: ID does not exist"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.589807 4778 scope.go:117] "RemoveContainer" containerID="cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.590012 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934"} err="failed to get container status \"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934\": rpc error: code = NotFound desc = could not find container \"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934\": container with ID starting with cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934 not found: ID does not exist"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.590040 4778 scope.go:117] "RemoveContainer" containerID="a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.590336 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4"} err="failed to get container status \"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4\": rpc error: code = NotFound desc = could not find container \"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4\": container with ID starting with a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4 not found: ID does not exist"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.590353 4778 scope.go:117] "RemoveContainer" containerID="0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.590762 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704"} err="failed to get container status \"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704\": rpc error: code = NotFound desc = could not find container \"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704\": container with ID starting with 0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704 not found: ID does not exist"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.590787 4778 scope.go:117] "RemoveContainer" containerID="5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.591117 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0"} err="failed to get container status \"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0\": rpc error: code = NotFound desc = could not find container \"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0\": container with ID starting with 5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0 not found: ID does not exist"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.591135 4778 scope.go:117] "RemoveContainer" containerID="cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934"
Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.591395 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934"} err="failed to get container status \"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934\": rpc error: code = NotFound desc = could not find container \"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934\": container with ID starting with cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934 not found: ID does not exist"
Dec 
05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.591413 4778 scope.go:117] "RemoveContainer" containerID="a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.591710 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4"} err="failed to get container status \"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4\": rpc error: code = NotFound desc = could not find container \"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4\": container with ID starting with a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4 not found: ID does not exist" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.591726 4778 scope.go:117] "RemoveContainer" containerID="0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.591969 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704"} err="failed to get container status \"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704\": rpc error: code = NotFound desc = could not find container \"0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704\": container with ID starting with 0bf0fbf3aff219ccc0660710f35a4626ce613aeba7915fbaec813dfb28ec1704 not found: ID does not exist" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.591988 4778 scope.go:117] "RemoveContainer" containerID="5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.592306 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0"} err="failed to get container status \"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0\": rpc error: code = NotFound desc = could not find container \"5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0\": container with ID starting with 5deea40e053a1e7c1faf65826df3d789084e5fddc7d234a7d0593e19a76d5bd0 not found: ID does not exist" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.592324 4778 scope.go:117] "RemoveContainer" containerID="cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.592634 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934"} err="failed to get container status \"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934\": rpc error: code = NotFound desc = could not find container \"cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934\": container with ID starting with cd7d9445cf01e4aec39112f4548393239b39b94d93432aae88aa1e246baa6934 not found: ID does not exist" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.592651 4778 scope.go:117] "RemoveContainer" containerID="a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.592859 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4"} err="failed to get container status 
\"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4\": rpc error: code = NotFound desc = could not find container \"a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4\": container with ID starting with a215541640d7510dd8c736855e3a065a96a5abfc773e6e5b2419a66eaee012a4 not found: ID does not exist" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.658757 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71865766-7f47-4030-8900-e2f914bc1ae4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.872681 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.881935 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.907498 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:28 crc kubenswrapper[4778]: E1205 16:20:28.907883 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="sg-core" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.907897 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="sg-core" Dec 05 16:20:28 crc kubenswrapper[4778]: E1205 16:20:28.907908 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="proxy-httpd" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.907916 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="proxy-httpd" Dec 05 16:20:28 crc kubenswrapper[4778]: E1205 16:20:28.907946 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="ceilometer-central-agent" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.907953 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="ceilometer-central-agent" Dec 05 16:20:28 crc kubenswrapper[4778]: E1205 16:20:28.907964 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="ceilometer-notification-agent" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.907970 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="ceilometer-notification-agent" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.908112 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="proxy-httpd" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.908126 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="ceilometer-central-agent" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.908133 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="sg-core" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.908143 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" containerName="ceilometer-notification-agent" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 
16:20:28.909927 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.912075 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.912233 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.916628 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 16:20:28 crc kubenswrapper[4778]: I1205 16:20:28.926641 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.063768 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-log-httpd\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.063865 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.063905 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.063942 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.063964 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-scripts\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.063982 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq8dj\" (UniqueName: \"kubernetes.io/projected/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-kube-api-access-cq8dj\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.064001 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-run-httpd\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 
16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.064033 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-config-data\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.164905 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.165194 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-scripts\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.165215 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq8dj\" (UniqueName: \"kubernetes.io/projected/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-kube-api-access-cq8dj\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.165231 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-run-httpd\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.165259 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-config-data\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.165297 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-log-httpd\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.165342 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.165389 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.166016 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-run-httpd\") pod 
\"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.166156 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-log-httpd\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.171033 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-config-data\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.171695 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.172893 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-scripts\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.174263 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.180222 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.187608 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq8dj\" (UniqueName: \"kubernetes.io/projected/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-kube-api-access-cq8dj\") pod \"ceilometer-0\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.232088 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.264149 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71865766-7f47-4030-8900-e2f914bc1ae4" path="/var/lib/kubelet/pods/71865766-7f47-4030-8900-e2f914bc1ae4/volumes" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.652543 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:29 crc kubenswrapper[4778]: W1205 16:20:29.674477 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8b241d2_6ccb_4b8a_96f6_2dc5d4e66239.slice/crio-b9d71dda09589b57491e5fb7c551547a0ee5ef2e298eece276f92633e6e8f30f WatchSource:0}: Error finding container b9d71dda09589b57491e5fb7c551547a0ee5ef2e298eece276f92633e6e8f30f: Status 404 returned error can't find the container with id b9d71dda09589b57491e5fb7c551547a0ee5ef2e298eece276f92633e6e8f30f Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.861911 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-hbv72" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.940462 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.985689 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6-operator-scripts\") pod \"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6\" (UID: \"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6\") " Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.985753 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtlqz\" (UniqueName: \"kubernetes.io/projected/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6-kube-api-access-xtlqz\") pod \"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6\" (UID: \"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6\") " Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.989096 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb9e88fb-e9c7-42a6-aa64-44d84d7317f6" (UID: "fb9e88fb-e9c7-42a6-aa64-44d84d7317f6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:29 crc kubenswrapper[4778]: I1205 16:20:29.990926 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6-kube-api-access-xtlqz" (OuterVolumeSpecName: "kube-api-access-xtlqz") pod "fb9e88fb-e9c7-42a6-aa64-44d84d7317f6" (UID: "fb9e88fb-e9c7-42a6-aa64-44d84d7317f6"). InnerVolumeSpecName "kube-api-access-xtlqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.087497 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkz7z\" (UniqueName: \"kubernetes.io/projected/dc76cf1b-e970-4771-bd28-09ad798f33e5-kube-api-access-kkz7z\") pod \"dc76cf1b-e970-4771-bd28-09ad798f33e5\" (UID: \"dc76cf1b-e970-4771-bd28-09ad798f33e5\") " Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.087574 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc76cf1b-e970-4771-bd28-09ad798f33e5-operator-scripts\") pod \"dc76cf1b-e970-4771-bd28-09ad798f33e5\" (UID: \"dc76cf1b-e970-4771-bd28-09ad798f33e5\") " Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.088145 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.088173 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtlqz\" (UniqueName: \"kubernetes.io/projected/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6-kube-api-access-xtlqz\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.089091 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc76cf1b-e970-4771-bd28-09ad798f33e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc76cf1b-e970-4771-bd28-09ad798f33e5" (UID: "dc76cf1b-e970-4771-bd28-09ad798f33e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.093245 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc76cf1b-e970-4771-bd28-09ad798f33e5-kube-api-access-kkz7z" (OuterVolumeSpecName: "kube-api-access-kkz7z") pod "dc76cf1b-e970-4771-bd28-09ad798f33e5" (UID: "dc76cf1b-e970-4771-bd28-09ad798f33e5"). InnerVolumeSpecName "kube-api-access-kkz7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.189511 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkz7z\" (UniqueName: \"kubernetes.io/projected/dc76cf1b-e970-4771-bd28-09ad798f33e5-kube-api-access-kkz7z\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.189552 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc76cf1b-e970-4771-bd28-09ad798f33e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.526132 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-hbv72" Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.526146 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-hbv72" event={"ID":"fb9e88fb-e9c7-42a6-aa64-44d84d7317f6","Type":"ContainerDied","Data":"948a35e99e62b9f2c4dc4fda4a6d333bd27795c654df37919c9a3511de795a7c"} Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.526656 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="948a35e99e62b9f2c4dc4fda4a6d333bd27795c654df37919c9a3511de795a7c" Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.528579 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239","Type":"ContainerStarted","Data":"2070a19c563e43128aa425f8bf3e1fd5b285bf3616e96a1fa4de19e3fae118c5"} Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.528620 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239","Type":"ContainerStarted","Data":"b9d71dda09589b57491e5fb7c551547a0ee5ef2e298eece276f92633e6e8f30f"} Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.530263 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2" event={"ID":"dc76cf1b-e970-4771-bd28-09ad798f33e5","Type":"ContainerDied","Data":"e0991d9a8ae801b1e7e176a99a8b7b3822ddd63c682f4ebbe699fda2471e14ef"} Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.530296 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0991d9a8ae801b1e7e176a99a8b7b3822ddd63c682f4ebbe699fda2471e14ef" Dec 05 16:20:30 crc kubenswrapper[4778]: I1205 16:20:30.530349 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.544068 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239","Type":"ContainerStarted","Data":"c4e68fcc28ecfac35293b20d98718f8d9bd5887827f8d05943e69a3c86982cca"} Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.677763 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw"] Dec 05 16:20:31 crc kubenswrapper[4778]: E1205 16:20:31.678394 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9e88fb-e9c7-42a6-aa64-44d84d7317f6" containerName="mariadb-database-create" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.678488 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9e88fb-e9c7-42a6-aa64-44d84d7317f6" containerName="mariadb-database-create" Dec 05 16:20:31 crc kubenswrapper[4778]: E1205 16:20:31.678577 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc76cf1b-e970-4771-bd28-09ad798f33e5" containerName="mariadb-account-create-update" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.678666 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc76cf1b-e970-4771-bd28-09ad798f33e5" containerName="mariadb-account-create-update" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.678971 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc76cf1b-e970-4771-bd28-09ad798f33e5" containerName="mariadb-account-create-update" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.679072 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9e88fb-e9c7-42a6-aa64-44d84d7317f6" containerName="mariadb-database-create" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.679829 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.681833 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-gvrw6" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.682280 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.686233 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw"] Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.811956 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-z6bpw\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.812010 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmfcg\" (UniqueName: \"kubernetes.io/projected/f25824f8-0ff4-400a-b58f-54918f20860e-kube-api-access-cmfcg\") pod \"watcher-kuttl-db-sync-z6bpw\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.812447 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-config-data\") pod \"watcher-kuttl-db-sync-z6bpw\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.812512 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-db-sync-config-data\") pod \"watcher-kuttl-db-sync-z6bpw\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.914110 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-config-data\") pod \"watcher-kuttl-db-sync-z6bpw\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.914169 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-db-sync-config-data\") pod \"watcher-kuttl-db-sync-z6bpw\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.914252 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-z6bpw\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc 
kubenswrapper[4778]: I1205 16:20:31.914294 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmfcg\" (UniqueName: \"kubernetes.io/projected/f25824f8-0ff4-400a-b58f-54918f20860e-kube-api-access-cmfcg\") pod \"watcher-kuttl-db-sync-z6bpw\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.918569 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-db-sync-config-data\") pod \"watcher-kuttl-db-sync-z6bpw\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.918882 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-z6bpw\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.919420 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-config-data\") pod \"watcher-kuttl-db-sync-z6bpw\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.945864 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmfcg\" (UniqueName: \"kubernetes.io/projected/f25824f8-0ff4-400a-b58f-54918f20860e-kube-api-access-cmfcg\") pod \"watcher-kuttl-db-sync-z6bpw\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:31 crc kubenswrapper[4778]: I1205 16:20:31.994075 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:32 crc kubenswrapper[4778]: I1205 16:20:32.504277 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw"] Dec 05 16:20:32 crc kubenswrapper[4778]: W1205 16:20:32.508477 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf25824f8_0ff4_400a_b58f_54918f20860e.slice/crio-511c1dc0e28e6e1a2e9c164a22146994de59bd8ae07cb7a293fa3927045291b4 WatchSource:0}: Error finding container 511c1dc0e28e6e1a2e9c164a22146994de59bd8ae07cb7a293fa3927045291b4: Status 404 returned error can't find the container with id 511c1dc0e28e6e1a2e9c164a22146994de59bd8ae07cb7a293fa3927045291b4 Dec 05 16:20:32 crc kubenswrapper[4778]: I1205 16:20:32.564123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" event={"ID":"f25824f8-0ff4-400a-b58f-54918f20860e","Type":"ContainerStarted","Data":"511c1dc0e28e6e1a2e9c164a22146994de59bd8ae07cb7a293fa3927045291b4"} Dec 05 16:20:32 crc kubenswrapper[4778]: I1205 16:20:32.567639 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239","Type":"ContainerStarted","Data":"e87c47477d5c78ebdbdc2aa39419c65f4b2322c0daed314d55cbfd23f7ccec6a"} Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.414144 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.414425 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.414464 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.415037 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.415085 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" gracePeriod=600 Dec 05 16:20:33 crc kubenswrapper[4778]: E1205 16:20:33.538810 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.579531 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" exitCode=0 Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.579603 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0"} Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.579677 4778 scope.go:117] "RemoveContainer" containerID="9838a46c7fca5484e5528acba6a6dc7600262ec3d0517e19089e823847361767" Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.580526 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:20:33 crc kubenswrapper[4778]: E1205 16:20:33.580828 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.583059 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239","Type":"ContainerStarted","Data":"f9a0f26a5a8ad608f4a13fa73fe9d3cc38dfd7674f1e8fbbb73e10f94e88c416"} Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.583249 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.585034 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" event={"ID":"f25824f8-0ff4-400a-b58f-54918f20860e","Type":"ContainerStarted","Data":"63751f32c9e71d97bf22cce360c68eaf72a1a6f2ef0671e11fd2d3827ffb489b"} Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.633073 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" podStartSLOduration=2.63304881 podStartE2EDuration="2.63304881s" podCreationTimestamp="2025-12-05 16:20:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:20:33.621339979 +0000 UTC m=+1520.725136369" watchObservedRunningTime="2025-12-05 16:20:33.63304881 +0000 UTC m=+1520.736845200" Dec 05 16:20:33 crc kubenswrapper[4778]: I1205 16:20:33.662707 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.576636903 podStartE2EDuration="5.662689609s" podCreationTimestamp="2025-12-05 16:20:28 +0000 UTC" firstStartedPulling="2025-12-05 16:20:29.677528753 +0000 UTC m=+1516.781325133" lastFinishedPulling="2025-12-05 16:20:32.763581459 +0000 UTC m=+1519.867377839" observedRunningTime="2025-12-05 
16:20:33.650619529 +0000 UTC m=+1520.754415919" watchObservedRunningTime="2025-12-05 16:20:33.662689609 +0000 UTC m=+1520.766485989" Dec 05 16:20:35 crc kubenswrapper[4778]: I1205 16:20:35.606329 4778 generic.go:334] "Generic (PLEG): container finished" podID="f25824f8-0ff4-400a-b58f-54918f20860e" containerID="63751f32c9e71d97bf22cce360c68eaf72a1a6f2ef0671e11fd2d3827ffb489b" exitCode=0 Dec 05 16:20:35 crc kubenswrapper[4778]: I1205 16:20:35.606404 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" event={"ID":"f25824f8-0ff4-400a-b58f-54918f20860e","Type":"ContainerDied","Data":"63751f32c9e71d97bf22cce360c68eaf72a1a6f2ef0671e11fd2d3827ffb489b"} Dec 05 16:20:36 crc kubenswrapper[4778]: I1205 16:20:36.997150 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.129996 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-config-data\") pod \"f25824f8-0ff4-400a-b58f-54918f20860e\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.130065 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-combined-ca-bundle\") pod \"f25824f8-0ff4-400a-b58f-54918f20860e\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.130098 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmfcg\" (UniqueName: \"kubernetes.io/projected/f25824f8-0ff4-400a-b58f-54918f20860e-kube-api-access-cmfcg\") pod \"f25824f8-0ff4-400a-b58f-54918f20860e\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.130241 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-db-sync-config-data\") pod \"f25824f8-0ff4-400a-b58f-54918f20860e\" (UID: \"f25824f8-0ff4-400a-b58f-54918f20860e\") " Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.134749 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25824f8-0ff4-400a-b58f-54918f20860e-kube-api-access-cmfcg" (OuterVolumeSpecName: "kube-api-access-cmfcg") pod "f25824f8-0ff4-400a-b58f-54918f20860e" (UID: "f25824f8-0ff4-400a-b58f-54918f20860e"). InnerVolumeSpecName "kube-api-access-cmfcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.149627 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f25824f8-0ff4-400a-b58f-54918f20860e" (UID: "f25824f8-0ff4-400a-b58f-54918f20860e"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.177052 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f25824f8-0ff4-400a-b58f-54918f20860e" (UID: "f25824f8-0ff4-400a-b58f-54918f20860e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.204590 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-config-data" (OuterVolumeSpecName: "config-data") pod "f25824f8-0ff4-400a-b58f-54918f20860e" (UID: "f25824f8-0ff4-400a-b58f-54918f20860e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.236215 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.236269 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.236286 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmfcg\" (UniqueName: \"kubernetes.io/projected/f25824f8-0ff4-400a-b58f-54918f20860e-kube-api-access-cmfcg\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.236298 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f25824f8-0ff4-400a-b58f-54918f20860e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:37 crc kubenswrapper[4778]: E1205 16:20:37.404415 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf25824f8_0ff4_400a_b58f_54918f20860e.slice\": RecentStats: unable to find data in memory cache]" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.628125 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" event={"ID":"f25824f8-0ff4-400a-b58f-54918f20860e","Type":"ContainerDied","Data":"511c1dc0e28e6e1a2e9c164a22146994de59bd8ae07cb7a293fa3927045291b4"} Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.628509 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="511c1dc0e28e6e1a2e9c164a22146994de59bd8ae07cb7a293fa3927045291b4" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.628226 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.940980 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:20:37 crc kubenswrapper[4778]: E1205 16:20:37.941321 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25824f8-0ff4-400a-b58f-54918f20860e" containerName="watcher-kuttl-db-sync" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.941338 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25824f8-0ff4-400a-b58f-54918f20860e" containerName="watcher-kuttl-db-sync" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.941505 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25824f8-0ff4-400a-b58f-54918f20860e" containerName="watcher-kuttl-db-sync" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.942041 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.945034 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-gvrw6" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.945332 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.961747 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.969858 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.971229 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.975415 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 16:20:37 crc kubenswrapper[4778]: I1205 16:20:37.990013 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.050002 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae6197d-57c4-4329-a982-2a458d366dcc-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.050040 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.050062 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.050168 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.050216 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqncz\" (UniqueName: \"kubernetes.io/projected/fae6197d-57c4-4329-a982-2a458d366dcc-kube-api-access-xqncz\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.056076 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.061012 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.077599 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.080846 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.151449 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqncz\" (UniqueName: \"kubernetes.io/projected/fae6197d-57c4-4329-a982-2a458d366dcc-kube-api-access-xqncz\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.151693 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.151808 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae6197d-57c4-4329-a982-2a458d366dcc-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.151932 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.152010 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.152144 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.152212 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7001d061-fad8-4b0b-b9ee-fa1eac930efd-logs\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.152290 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" 
(UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.152391 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.152479 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqpj2\" (UniqueName: \"kubernetes.io/projected/7001d061-fad8-4b0b-b9ee-fa1eac930efd-kube-api-access-bqpj2\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.152585 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae6197d-57c4-4329-a982-2a458d366dcc-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.158222 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.168682 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqncz\" (UniqueName: \"kubernetes.io/projected/fae6197d-57c4-4329-a982-2a458d366dcc-kube-api-access-xqncz\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.174832 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.178019 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.253471 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvs8\" (UniqueName: \"kubernetes.io/projected/0536ee06-b83a-4947-9b31-40308c6ccb7a-kube-api-access-5lvs8\") pod \"watcher-kuttl-applier-0\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.254014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.254115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0536ee06-b83a-4947-9b31-40308c6ccb7a-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.254233 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0536ee06-b83a-4947-9b31-40308c6ccb7a-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.254343 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.254434 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7001d061-fad8-4b0b-b9ee-fa1eac930efd-logs\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.254509 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.254590 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0536ee06-b83a-4947-9b31-40308c6ccb7a-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.254666 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqpj2\" (UniqueName: \"kubernetes.io/projected/7001d061-fad8-4b0b-b9ee-fa1eac930efd-kube-api-access-bqpj2\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.255123 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7001d061-fad8-4b0b-b9ee-fa1eac930efd-logs\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.263961 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-custom-prometheus-ca\") 
pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.264062 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.264208 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.289505 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqpj2\" (UniqueName: \"kubernetes.io/projected/7001d061-fad8-4b0b-b9ee-fa1eac930efd-kube-api-access-bqpj2\") pod \"watcher-kuttl-api-0\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.299261 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.313764 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.355621 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0536ee06-b83a-4947-9b31-40308c6ccb7a-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.355687 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvs8\" (UniqueName: \"kubernetes.io/projected/0536ee06-b83a-4947-9b31-40308c6ccb7a-kube-api-access-5lvs8\") pod \"watcher-kuttl-applier-0\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.355712 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0536ee06-b83a-4947-9b31-40308c6ccb7a-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.355800 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0536ee06-b83a-4947-9b31-40308c6ccb7a-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.356248 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0536ee06-b83a-4947-9b31-40308c6ccb7a-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.360543 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0536ee06-b83a-4947-9b31-40308c6ccb7a-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.363095 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0536ee06-b83a-4947-9b31-40308c6ccb7a-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.372638 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvs8\" (UniqueName: \"kubernetes.io/projected/0536ee06-b83a-4947-9b31-40308c6ccb7a-kube-api-access-5lvs8\") pod \"watcher-kuttl-applier-0\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.390331 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.788923 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.796171 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:20:38 crc kubenswrapper[4778]: I1205 16:20:38.917985 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:20:38 crc kubenswrapper[4778]: W1205 16:20:38.919457 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0536ee06_b83a_4947_9b31_40308c6ccb7a.slice/crio-db18787261dfd6630a7fb7b4530c861738fc6649d84761d34b5677ad001eab1a WatchSource:0}: Error finding container db18787261dfd6630a7fb7b4530c861738fc6649d84761d34b5677ad001eab1a: Status 404 returned error can't find the container with id db18787261dfd6630a7fb7b4530c861738fc6649d84761d34b5677ad001eab1a Dec 05 16:20:39 crc kubenswrapper[4778]: I1205 16:20:39.649151 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7001d061-fad8-4b0b-b9ee-fa1eac930efd","Type":"ContainerStarted","Data":"4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3"} Dec 05 16:20:39 crc kubenswrapper[4778]: I1205 16:20:39.650538 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7001d061-fad8-4b0b-b9ee-fa1eac930efd","Type":"ContainerStarted","Data":"96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef"} Dec 05 16:20:39 crc kubenswrapper[4778]: I1205 16:20:39.650632 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:39 crc kubenswrapper[4778]: I1205 16:20:39.650705 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"7001d061-fad8-4b0b-b9ee-fa1eac930efd","Type":"ContainerStarted","Data":"2fda272f096317bee895d4dd39433568b34c1951bfff23a17b3fb1ceb2733d82"} Dec 05 16:20:39 crc kubenswrapper[4778]: I1205 16:20:39.651013 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"0536ee06-b83a-4947-9b31-40308c6ccb7a","Type":"ContainerStarted","Data":"daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673"} Dec 05 16:20:39 crc kubenswrapper[4778]: I1205 16:20:39.651044 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"0536ee06-b83a-4947-9b31-40308c6ccb7a","Type":"ContainerStarted","Data":"db18787261dfd6630a7fb7b4530c861738fc6649d84761d34b5677ad001eab1a"} Dec 05 16:20:39 crc kubenswrapper[4778]: I1205 16:20:39.652552 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"fae6197d-57c4-4329-a982-2a458d366dcc","Type":"ContainerStarted","Data":"6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a"} Dec 05 16:20:39 crc kubenswrapper[4778]: I1205 16:20:39.652674 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"fae6197d-57c4-4329-a982-2a458d366dcc","Type":"ContainerStarted","Data":"2894af30b1451093add8e2ba124242ebbe4ced9d6446a327bb9e48eab5bd9b77"} Dec 05 16:20:39 crc kubenswrapper[4778]: I1205 16:20:39.707116 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.707097823 podStartE2EDuration="2.707097823s" podCreationTimestamp="2025-12-05 16:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:20:39.675699867 +0000 UTC m=+1526.779496257" watchObservedRunningTime="2025-12-05 16:20:39.707097823 +0000 UTC m=+1526.810894203" Dec 05 16:20:39 crc kubenswrapper[4778]: I1205 16:20:39.712294 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.71227664 podStartE2EDuration="2.71227664s" podCreationTimestamp="2025-12-05 16:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:20:39.70362562 +0000 UTC m=+1526.807422000" watchObservedRunningTime="2025-12-05 16:20:39.71227664 +0000 UTC m=+1526.816073020" Dec 05 16:20:39 crc kubenswrapper[4778]: I1205 16:20:39.742520 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.742497165 podStartE2EDuration="1.742497165s" podCreationTimestamp="2025-12-05 16:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:20:39.736001402 +0000 UTC m=+1526.839797782" watchObservedRunningTime="2025-12-05 16:20:39.742497165 +0000 UTC m=+1526.846293545" Dec 05 16:20:41 crc kubenswrapper[4778]: I1205 16:20:41.670414 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 16:20:42 crc kubenswrapper[4778]: I1205 16:20:42.036130 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:43 crc kubenswrapper[4778]: 
Dec 05 16:20:43 crc kubenswrapper[4778]: I1205 16:20:43.391094 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:20:46 crc kubenswrapper[4778]: I1205 16:20:46.249646 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0"
Dec 05 16:20:46 crc kubenswrapper[4778]: E1205 16:20:46.250171 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e"
Dec 05 16:20:48 crc kubenswrapper[4778]: I1205 16:20:48.300224 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:20:48 crc kubenswrapper[4778]: I1205 16:20:48.315046 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:20:48 crc kubenswrapper[4778]: I1205 16:20:48.322167 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:20:48 crc kubenswrapper[4778]: I1205 16:20:48.327478 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:20:48 crc kubenswrapper[4778]: I1205 16:20:48.391199 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:20:48 crc kubenswrapper[4778]: I1205 16:20:48.421961 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:20:48 crc kubenswrapper[4778]: I1205 16:20:48.985071 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:20:48 crc kubenswrapper[4778]: I1205 16:20:48.990683 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:20:49 crc kubenswrapper[4778]: I1205 16:20:49.018285 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:20:49 crc kubenswrapper[4778]: I1205 16:20:49.018446 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.059430 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw"]
Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.083381 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-z6bpw"]
Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.087079 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher9df0-account-delete-25tk7"]
Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.088082 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7"
Need to start a new one" pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.104434 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher9df0-account-delete-25tk7"] Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.164005 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.186727 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpz78\" (UniqueName: \"kubernetes.io/projected/47c177ae-4cee-447f-a426-18ddcb4af8e7-kube-api-access-cpz78\") pod \"watcher9df0-account-delete-25tk7\" (UID: \"47c177ae-4cee-447f-a426-18ddcb4af8e7\") " pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.186862 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47c177ae-4cee-447f-a426-18ddcb4af8e7-operator-scripts\") pod \"watcher9df0-account-delete-25tk7\" (UID: \"47c177ae-4cee-447f-a426-18ddcb4af8e7\") " pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.260079 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25824f8-0ff4-400a-b58f-54918f20860e" path="/var/lib/kubelet/pods/f25824f8-0ff4-400a-b58f-54918f20860e/volumes" Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.279857 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.280070 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="7001d061-fad8-4b0b-b9ee-fa1eac930efd" containerName="watcher-kuttl-api-log" containerID="cri-o://96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef" gracePeriod=30 Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.280447 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="7001d061-fad8-4b0b-b9ee-fa1eac930efd" containerName="watcher-api" containerID="cri-o://4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3" gracePeriod=30 Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.287957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpz78\" (UniqueName: \"kubernetes.io/projected/47c177ae-4cee-447f-a426-18ddcb4af8e7-kube-api-access-cpz78\") pod \"watcher9df0-account-delete-25tk7\" (UID: \"47c177ae-4cee-447f-a426-18ddcb4af8e7\") " pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.288053 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47c177ae-4cee-447f-a426-18ddcb4af8e7-operator-scripts\") pod \"watcher9df0-account-delete-25tk7\" (UID: \"47c177ae-4cee-447f-a426-18ddcb4af8e7\") " pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.288749 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/47c177ae-4cee-447f-a426-18ddcb4af8e7-operator-scripts\") pod \"watcher9df0-account-delete-25tk7\" (UID: \"47c177ae-4cee-447f-a426-18ddcb4af8e7\") " pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.291312 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.291634 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="0536ee06-b83a-4947-9b31-40308c6ccb7a" containerName="watcher-applier" containerID="cri-o://daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673" gracePeriod=30 Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.320945 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpz78\" (UniqueName: \"kubernetes.io/projected/47c177ae-4cee-447f-a426-18ddcb4af8e7-kube-api-access-cpz78\") pod \"watcher9df0-account-delete-25tk7\" (UID: \"47c177ae-4cee-447f-a426-18ddcb4af8e7\") " pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.419129 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.786961 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.791711 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="ceilometer-central-agent" containerID="cri-o://2070a19c563e43128aa425f8bf3e1fd5b285bf3616e96a1fa4de19e3fae118c5" gracePeriod=30 Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.792451 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="proxy-httpd" containerID="cri-o://f9a0f26a5a8ad608f4a13fa73fe9d3cc38dfd7674f1e8fbbb73e10f94e88c416" gracePeriod=30 Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.792511 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="ceilometer-notification-agent" containerID="cri-o://c4e68fcc28ecfac35293b20d98718f8d9bd5887827f8d05943e69a3c86982cca" gracePeriod=30 Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.792632 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="sg-core" containerID="cri-o://e87c47477d5c78ebdbdc2aa39419c65f4b2322c0daed314d55cbfd23f7ccec6a" gracePeriod=30 Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.808229 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.136:3000/\": EOF" Dec 05 16:20:51 crc kubenswrapper[4778]: I1205 16:20:51.903170 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher9df0-account-delete-25tk7"] Dec 05 16:20:51 crc kubenswrapper[4778]: W1205 16:20:51.916283 4778 
Dec 05 16:20:52 crc kubenswrapper[4778]: I1205 16:20:52.017988 4778 generic.go:334] "Generic (PLEG): container finished" podID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerID="e87c47477d5c78ebdbdc2aa39419c65f4b2322c0daed314d55cbfd23f7ccec6a" exitCode=2
Dec 05 16:20:52 crc kubenswrapper[4778]: I1205 16:20:52.018043 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239","Type":"ContainerDied","Data":"e87c47477d5c78ebdbdc2aa39419c65f4b2322c0daed314d55cbfd23f7ccec6a"}
Dec 05 16:20:52 crc kubenswrapper[4778]: I1205 16:20:52.024007 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" event={"ID":"47c177ae-4cee-447f-a426-18ddcb4af8e7","Type":"ContainerStarted","Data":"9e03d577fbf094b7fd19179c426f974063a667673f6392ff5b3aa3e8841800c5"}
Dec 05 16:20:52 crc kubenswrapper[4778]: I1205 16:20:52.028819 4778 generic.go:334] "Generic (PLEG): container finished" podID="7001d061-fad8-4b0b-b9ee-fa1eac930efd" containerID="96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef" exitCode=143
Dec 05 16:20:52 crc kubenswrapper[4778]: I1205 16:20:52.028913 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7001d061-fad8-4b0b-b9ee-fa1eac930efd","Type":"ContainerDied","Data":"96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef"}
Dec 05 16:20:52 crc kubenswrapper[4778]: I1205 16:20:52.029008 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="fae6197d-57c4-4329-a982-2a458d366dcc" containerName="watcher-decision-engine" containerID="cri-o://6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a" gracePeriod=30
Dec 05 16:20:52 crc kubenswrapper[4778]: I1205 16:20:52.820703 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.020596 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-custom-prometheus-ca\") pod \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.020658 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-combined-ca-bundle\") pod \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.020777 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7001d061-fad8-4b0b-b9ee-fa1eac930efd-logs\") pod \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.020840 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-config-data\") pod \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.020961 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqpj2\" (UniqueName: \"kubernetes.io/projected/7001d061-fad8-4b0b-b9ee-fa1eac930efd-kube-api-access-bqpj2\") pod \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\" (UID: \"7001d061-fad8-4b0b-b9ee-fa1eac930efd\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.021425 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7001d061-fad8-4b0b-b9ee-fa1eac930efd-logs" (OuterVolumeSpecName: "logs") pod "7001d061-fad8-4b0b-b9ee-fa1eac930efd" (UID: "7001d061-fad8-4b0b-b9ee-fa1eac930efd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.030197 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7001d061-fad8-4b0b-b9ee-fa1eac930efd-kube-api-access-bqpj2" (OuterVolumeSpecName: "kube-api-access-bqpj2") pod "7001d061-fad8-4b0b-b9ee-fa1eac930efd" (UID: "7001d061-fad8-4b0b-b9ee-fa1eac930efd"). InnerVolumeSpecName "kube-api-access-bqpj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.047325 4778 generic.go:334] "Generic (PLEG): container finished" podID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerID="f9a0f26a5a8ad608f4a13fa73fe9d3cc38dfd7674f1e8fbbb73e10f94e88c416" exitCode=0 Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.047397 4778 generic.go:334] "Generic (PLEG): container finished" podID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerID="c4e68fcc28ecfac35293b20d98718f8d9bd5887827f8d05943e69a3c86982cca" exitCode=0 Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.047409 4778 generic.go:334] "Generic (PLEG): container finished" podID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerID="2070a19c563e43128aa425f8bf3e1fd5b285bf3616e96a1fa4de19e3fae118c5" exitCode=0 Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.047486 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239","Type":"ContainerDied","Data":"f9a0f26a5a8ad608f4a13fa73fe9d3cc38dfd7674f1e8fbbb73e10f94e88c416"} Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.047516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239","Type":"ContainerDied","Data":"c4e68fcc28ecfac35293b20d98718f8d9bd5887827f8d05943e69a3c86982cca"} Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.047528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239","Type":"ContainerDied","Data":"2070a19c563e43128aa425f8bf3e1fd5b285bf3616e96a1fa4de19e3fae118c5"} Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.060394 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7001d061-fad8-4b0b-b9ee-fa1eac930efd" (UID: "7001d061-fad8-4b0b-b9ee-fa1eac930efd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.060975 4778 generic.go:334] "Generic (PLEG): container finished" podID="47c177ae-4cee-447f-a426-18ddcb4af8e7" containerID="18e7a51224dad800e7e180f4b2341dde27c565a5a3bf48e60cb662f7dedd18fd" exitCode=0 Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.061048 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" event={"ID":"47c177ae-4cee-447f-a426-18ddcb4af8e7","Type":"ContainerDied","Data":"18e7a51224dad800e7e180f4b2341dde27c565a5a3bf48e60cb662f7dedd18fd"} Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.065525 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "7001d061-fad8-4b0b-b9ee-fa1eac930efd" (UID: "7001d061-fad8-4b0b-b9ee-fa1eac930efd"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.081475 4778 generic.go:334] "Generic (PLEG): container finished" podID="7001d061-fad8-4b0b-b9ee-fa1eac930efd" containerID="4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3" exitCode=0 Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.081534 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7001d061-fad8-4b0b-b9ee-fa1eac930efd","Type":"ContainerDied","Data":"4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3"} Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.081565 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7001d061-fad8-4b0b-b9ee-fa1eac930efd","Type":"ContainerDied","Data":"2fda272f096317bee895d4dd39433568b34c1951bfff23a17b3fb1ceb2733d82"} Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.081585 4778 scope.go:117] "RemoveContainer" containerID="4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.081830 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.101530 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-config-data" (OuterVolumeSpecName: "config-data") pod "7001d061-fad8-4b0b-b9ee-fa1eac930efd" (UID: "7001d061-fad8-4b0b-b9ee-fa1eac930efd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.122475 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqpj2\" (UniqueName: \"kubernetes.io/projected/7001d061-fad8-4b0b-b9ee-fa1eac930efd-kube-api-access-bqpj2\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.122511 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.122526 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.122537 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7001d061-fad8-4b0b-b9ee-fa1eac930efd-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.122548 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7001d061-fad8-4b0b-b9ee-fa1eac930efd-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.139928 4778 scope.go:117] "RemoveContainer" containerID="96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.186818 4778 scope.go:117] "RemoveContainer" containerID="4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3" Dec 05 16:20:53 crc kubenswrapper[4778]: E1205 16:20:53.190793 4778 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3\": container with ID starting with 4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3 not found: ID does not exist" containerID="4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.190829 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3"} err="failed to get container status \"4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3\": rpc error: code = NotFound desc = could not find container \"4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3\": container with ID starting with 4b3c6d4a760322390df8f74df77e63670a1e2f65282d793442f97d59c297aed3 not found: ID does not exist" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.190854 4778 scope.go:117] "RemoveContainer" containerID="96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef" Dec 05 16:20:53 crc kubenswrapper[4778]: E1205 16:20:53.191088 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef\": container with ID starting with 96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef not found: ID does not exist" containerID="96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.191108 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef"} err="failed to get container status \"96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef\": rpc error: code = NotFound desc = could not find container \"96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef\": container with ID starting with 96f9b2d61238b0d70f807169dbd95bef4485c82c926e9b0a42d1ee775709f5ef not found: ID does not exist" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.226735 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:53 crc kubenswrapper[4778]: E1205 16:20:53.393212 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:20:53 crc kubenswrapper[4778]: E1205 16:20:53.395232 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:20:53 crc kubenswrapper[4778]: E1205 16:20:53.397847 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:20:53 crc kubenswrapper[4778]: E1205 16:20:53.397885 4778 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="0536ee06-b83a-4947-9b31-40308c6ccb7a" containerName="watcher-applier" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.415502 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.425609 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.427984 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-run-httpd\") pod \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.428016 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-log-httpd\") pod \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.428076 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-sg-core-conf-yaml\") pod \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.428116 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-scripts\") pod \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.428143 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-config-data\") pod \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.428611 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" (UID: "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.428787 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-ceilometer-tls-certs\") pod \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.428814 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq8dj\" (UniqueName: \"kubernetes.io/projected/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-kube-api-access-cq8dj\") pod \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.428844 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-combined-ca-bundle\") pod \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\" (UID: \"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239\") " Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.428855 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" (UID: "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.429129 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.429148 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.447483 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-scripts" (OuterVolumeSpecName: "scripts") pod "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" (UID: "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.447537 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-kube-api-access-cq8dj" (OuterVolumeSpecName: "kube-api-access-cq8dj") pod "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" (UID: "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239"). InnerVolumeSpecName "kube-api-access-cq8dj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.470937 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" (UID: "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.496301 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" (UID: "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.520519 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-config-data" (OuterVolumeSpecName: "config-data") pod "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" (UID: "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.520547 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" (UID: "d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.529949 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.530002 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.530012 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.530020 4778 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.530029 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq8dj\" (UniqueName: \"kubernetes.io/projected/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-kube-api-access-cq8dj\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:53 crc kubenswrapper[4778]: I1205 16:20:53.530039 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.128087 4778 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.130463 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239","Type":"ContainerDied","Data":"b9d71dda09589b57491e5fb7c551547a0ee5ef2e298eece276f92633e6e8f30f"} Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.130525 4778 scope.go:117] "RemoveContainer" containerID="f9a0f26a5a8ad608f4a13fa73fe9d3cc38dfd7674f1e8fbbb73e10f94e88c416" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.182565 4778 scope.go:117] "RemoveContainer" containerID="e87c47477d5c78ebdbdc2aa39419c65f4b2322c0daed314d55cbfd23f7ccec6a" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.201491 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.221770 4778 scope.go:117] "RemoveContainer" containerID="c4e68fcc28ecfac35293b20d98718f8d9bd5887827f8d05943e69a3c86982cca" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.229886 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.268701 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:54 crc kubenswrapper[4778]: E1205 16:20:54.269051 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="proxy-httpd" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.269064 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="proxy-httpd" Dec 05 16:20:54 crc kubenswrapper[4778]: E1205 16:20:54.269076 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="ceilometer-notification-agent" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.269082 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="ceilometer-notification-agent" Dec 05 16:20:54 crc kubenswrapper[4778]: E1205 16:20:54.269091 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7001d061-fad8-4b0b-b9ee-fa1eac930efd" containerName="watcher-api" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.269097 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7001d061-fad8-4b0b-b9ee-fa1eac930efd" containerName="watcher-api" Dec 05 16:20:54 crc kubenswrapper[4778]: E1205 16:20:54.269111 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="sg-core" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.269116 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="sg-core" Dec 05 16:20:54 crc kubenswrapper[4778]: E1205 16:20:54.269126 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7001d061-fad8-4b0b-b9ee-fa1eac930efd" containerName="watcher-kuttl-api-log" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.269132 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7001d061-fad8-4b0b-b9ee-fa1eac930efd" containerName="watcher-kuttl-api-log" Dec 05 16:20:54 crc kubenswrapper[4778]: E1205 16:20:54.269147 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="ceilometer-central-agent" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.269153 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="ceilometer-central-agent" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.269288 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="ceilometer-notification-agent" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.269299 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7001d061-fad8-4b0b-b9ee-fa1eac930efd" containerName="watcher-api" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.269310 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="proxy-httpd" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.269326 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="ceilometer-central-agent" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.269334 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" containerName="sg-core" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.269346 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7001d061-fad8-4b0b-b9ee-fa1eac930efd" containerName="watcher-kuttl-api-log" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.270765 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.276808 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.277000 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.285341 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.288680 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.301553 4778 scope.go:117] "RemoveContainer" containerID="2070a19c563e43128aa425f8bf3e1fd5b285bf3616e96a1fa4de19e3fae118c5" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.448241 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.448313 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.448428 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.448842 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-scripts\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.449037 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8dbe352-c702-4ea0-abd4-a1caf21004cc-run-httpd\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.449351 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8dbe352-c702-4ea0-abd4-a1caf21004cc-log-httpd\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.449520 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljfll\" (UniqueName: \"kubernetes.io/projected/e8dbe352-c702-4ea0-abd4-a1caf21004cc-kube-api-access-ljfll\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.449637 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-config-data\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.550843 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.550893 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-scripts\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.550936 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8dbe352-c702-4ea0-abd4-a1caf21004cc-run-httpd\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.550964 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8dbe352-c702-4ea0-abd4-a1caf21004cc-log-httpd\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.550994 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljfll\" (UniqueName: \"kubernetes.io/projected/e8dbe352-c702-4ea0-abd4-a1caf21004cc-kube-api-access-ljfll\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.551015 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-config-data\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.551037 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.551060 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.551816 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8dbe352-c702-4ea0-abd4-a1caf21004cc-log-httpd\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.552279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8dbe352-c702-4ea0-abd4-a1caf21004cc-run-httpd\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.556072 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.556146 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.556305 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.556337 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.556488 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-scripts\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.556553 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8dbe352-c702-4ea0-abd4-a1caf21004cc-config-data\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.577088 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljfll\" (UniqueName: \"kubernetes.io/projected/e8dbe352-c702-4ea0-abd4-a1caf21004cc-kube-api-access-ljfll\") pod \"ceilometer-0\" (UID: \"e8dbe352-c702-4ea0-abd4-a1caf21004cc\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.610730 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.754410 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47c177ae-4cee-447f-a426-18ddcb4af8e7-operator-scripts\") pod \"47c177ae-4cee-447f-a426-18ddcb4af8e7\" (UID: \"47c177ae-4cee-447f-a426-18ddcb4af8e7\") " Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.754748 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpz78\" (UniqueName: \"kubernetes.io/projected/47c177ae-4cee-447f-a426-18ddcb4af8e7-kube-api-access-cpz78\") pod \"47c177ae-4cee-447f-a426-18ddcb4af8e7\" (UID: \"47c177ae-4cee-447f-a426-18ddcb4af8e7\") " Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.755311 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47c177ae-4cee-447f-a426-18ddcb4af8e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47c177ae-4cee-447f-a426-18ddcb4af8e7" (UID: "47c177ae-4cee-447f-a426-18ddcb4af8e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.760599 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47c177ae-4cee-447f-a426-18ddcb4af8e7-kube-api-access-cpz78" (OuterVolumeSpecName: "kube-api-access-cpz78") pod "47c177ae-4cee-447f-a426-18ddcb4af8e7" (UID: "47c177ae-4cee-447f-a426-18ddcb4af8e7"). InnerVolumeSpecName "kube-api-access-cpz78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.856206 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpz78\" (UniqueName: \"kubernetes.io/projected/47c177ae-4cee-447f-a426-18ddcb4af8e7-kube-api-access-cpz78\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:54 crc kubenswrapper[4778]: I1205 16:20:54.856243 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47c177ae-4cee-447f-a426-18ddcb4af8e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.099879 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 16:20:55 crc kubenswrapper[4778]: W1205 16:20:55.105653 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8dbe352_c702_4ea0_abd4_a1caf21004cc.slice/crio-980f096e4f568339d1ca75d8d02fc0fe2fbec92eee6e6b8598c638cb3c628de4 WatchSource:0}: Error finding container 980f096e4f568339d1ca75d8d02fc0fe2fbec92eee6e6b8598c638cb3c628de4: Status 404 returned error can't find the container with id 980f096e4f568339d1ca75d8d02fc0fe2fbec92eee6e6b8598c638cb3c628de4 Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.138796 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.138805 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher9df0-account-delete-25tk7" event={"ID":"47c177ae-4cee-447f-a426-18ddcb4af8e7","Type":"ContainerDied","Data":"9e03d577fbf094b7fd19179c426f974063a667673f6392ff5b3aa3e8841800c5"} Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.138877 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e03d577fbf094b7fd19179c426f974063a667673f6392ff5b3aa3e8841800c5" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.140279 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8dbe352-c702-4ea0-abd4-a1caf21004cc","Type":"ContainerStarted","Data":"980f096e4f568339d1ca75d8d02fc0fe2fbec92eee6e6b8598c638cb3c628de4"} Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.259591 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7001d061-fad8-4b0b-b9ee-fa1eac930efd" path="/var/lib/kubelet/pods/7001d061-fad8-4b0b-b9ee-fa1eac930efd/volumes" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.260296 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239" path="/var/lib/kubelet/pods/d8b241d2-6ccb-4b8a-96f6-2dc5d4e66239/volumes" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.804589 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.873670 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lvs8\" (UniqueName: \"kubernetes.io/projected/0536ee06-b83a-4947-9b31-40308c6ccb7a-kube-api-access-5lvs8\") pod \"0536ee06-b83a-4947-9b31-40308c6ccb7a\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.873788 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0536ee06-b83a-4947-9b31-40308c6ccb7a-combined-ca-bundle\") pod \"0536ee06-b83a-4947-9b31-40308c6ccb7a\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.873882 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0536ee06-b83a-4947-9b31-40308c6ccb7a-logs\") pod \"0536ee06-b83a-4947-9b31-40308c6ccb7a\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.873902 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0536ee06-b83a-4947-9b31-40308c6ccb7a-config-data\") pod \"0536ee06-b83a-4947-9b31-40308c6ccb7a\" (UID: \"0536ee06-b83a-4947-9b31-40308c6ccb7a\") " Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.874528 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0536ee06-b83a-4947-9b31-40308c6ccb7a-logs" (OuterVolumeSpecName: "logs") pod "0536ee06-b83a-4947-9b31-40308c6ccb7a" (UID: "0536ee06-b83a-4947-9b31-40308c6ccb7a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.880580 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0536ee06-b83a-4947-9b31-40308c6ccb7a-kube-api-access-5lvs8" (OuterVolumeSpecName: "kube-api-access-5lvs8") pod "0536ee06-b83a-4947-9b31-40308c6ccb7a" (UID: "0536ee06-b83a-4947-9b31-40308c6ccb7a"). InnerVolumeSpecName "kube-api-access-5lvs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.913860 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0536ee06-b83a-4947-9b31-40308c6ccb7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0536ee06-b83a-4947-9b31-40308c6ccb7a" (UID: "0536ee06-b83a-4947-9b31-40308c6ccb7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.920463 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0536ee06-b83a-4947-9b31-40308c6ccb7a-config-data" (OuterVolumeSpecName: "config-data") pod "0536ee06-b83a-4947-9b31-40308c6ccb7a" (UID: "0536ee06-b83a-4947-9b31-40308c6ccb7a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.975023 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lvs8\" (UniqueName: \"kubernetes.io/projected/0536ee06-b83a-4947-9b31-40308c6ccb7a-kube-api-access-5lvs8\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.975056 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0536ee06-b83a-4947-9b31-40308c6ccb7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.975069 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0536ee06-b83a-4947-9b31-40308c6ccb7a-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4778]: I1205 16:20:55.975079 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0536ee06-b83a-4947-9b31-40308c6ccb7a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.117573 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-hbv72"] Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.129098 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-hbv72"] Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.139172 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"] Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.150061 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher9df0-account-delete-25tk7"] Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.150570 4778 generic.go:334] "Generic (PLEG): container finished" podID="0536ee06-b83a-4947-9b31-40308c6ccb7a" containerID="daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673" exitCode=0 Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.150637 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"0536ee06-b83a-4947-9b31-40308c6ccb7a","Type":"ContainerDied","Data":"daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673"} Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.150674 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"0536ee06-b83a-4947-9b31-40308c6ccb7a","Type":"ContainerDied","Data":"db18787261dfd6630a7fb7b4530c861738fc6649d84761d34b5677ad001eab1a"} Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.150698 4778 scope.go:117] "RemoveContainer" containerID="daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673" Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.150853 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.152856 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8dbe352-c702-4ea0-abd4-a1caf21004cc","Type":"ContainerStarted","Data":"ae363e74ba6e34a961cf15195cbb7c86827b069c671a18ac7e6a39c00f519be9"} Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.162193 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher9df0-account-delete-25tk7"] Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.169543 4778 scope.go:117] "RemoveContainer" containerID="daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673" Dec 05 16:20:56 crc kubenswrapper[4778]: E1205 16:20:56.170610 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673\": container with ID starting with daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673 not found: ID does not exist" containerID="daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673" Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.170660 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673"} err="failed to get container status \"daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673\": rpc error: code = NotFound desc = could not find container \"daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673\": container with ID starting with daa6f46dbf546eb0884aef09a18267219da7b870ec364b8583b15d201203d673 not found: ID does not exist" Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.172509 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-9df0-account-create-update-qjjx2"] Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.198790 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:20:56 crc kubenswrapper[4778]: I1205 16:20:56.209669 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:20:57 crc kubenswrapper[4778]: I1205 16:20:57.162221 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8dbe352-c702-4ea0-abd4-a1caf21004cc","Type":"ContainerStarted","Data":"275150939e0823e5e0333eb556470949593091c637fba51a68c06c47e9201421"} Dec 05 16:20:57 crc kubenswrapper[4778]: I1205 16:20:57.162930 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8dbe352-c702-4ea0-abd4-a1caf21004cc","Type":"ContainerStarted","Data":"1072e61ba59df3437e6a328932f4ce1d7cf2f1405b6f5141c8fd0a8ca1ffcf56"} Dec 05 16:20:57 crc kubenswrapper[4778]: I1205 16:20:57.250202 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:20:57 crc kubenswrapper[4778]: E1205 16:20:57.250454 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:20:57 crc kubenswrapper[4778]: I1205 16:20:57.260870 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0536ee06-b83a-4947-9b31-40308c6ccb7a" path="/var/lib/kubelet/pods/0536ee06-b83a-4947-9b31-40308c6ccb7a/volumes" Dec 05 16:20:57 crc kubenswrapper[4778]: I1205 16:20:57.261524 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47c177ae-4cee-447f-a426-18ddcb4af8e7" path="/var/lib/kubelet/pods/47c177ae-4cee-447f-a426-18ddcb4af8e7/volumes" Dec 05 16:20:57 crc kubenswrapper[4778]: I1205 16:20:57.261977 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc76cf1b-e970-4771-bd28-09ad798f33e5" path="/var/lib/kubelet/pods/dc76cf1b-e970-4771-bd28-09ad798f33e5/volumes" Dec 05 16:20:57 crc kubenswrapper[4778]: I1205 16:20:57.262876 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb9e88fb-e9c7-42a6-aa64-44d84d7317f6" path="/var/lib/kubelet/pods/fb9e88fb-e9c7-42a6-aa64-44d84d7317f6/volumes" Dec 05 16:20:57 crc kubenswrapper[4778]: E1205 16:20:57.842552 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfae6197d_57c4_4329_a982_2a458d366dcc.slice/crio-6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a.scope\": RecentStats: unable to find data in memory cache]" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.181026 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.184583 4778 generic.go:334] "Generic (PLEG): container finished" podID="fae6197d-57c4-4329-a982-2a458d366dcc" containerID="6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a" exitCode=0 Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.184630 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"fae6197d-57c4-4329-a982-2a458d366dcc","Type":"ContainerDied","Data":"6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a"} Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.184658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"fae6197d-57c4-4329-a982-2a458d366dcc","Type":"ContainerDied","Data":"2894af30b1451093add8e2ba124242ebbe4ced9d6446a327bb9e48eab5bd9b77"} Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.184677 4778 scope.go:117] "RemoveContainer" containerID="6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.184802 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.216188 4778 scope.go:117] "RemoveContainer" containerID="6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a" Dec 05 16:20:58 crc kubenswrapper[4778]: E1205 16:20:58.217150 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a\": container with ID starting with 6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a not found: ID does not exist" containerID="6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.217205 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a"} err="failed to get container status \"6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a\": rpc error: code = NotFound desc = could not find container \"6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a\": container with ID starting with 6067a52ecbc55cdcfbcacada60b78f344ddf427abe99dd431e693fa12f67555a not found: ID does not exist" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.313216 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqncz\" (UniqueName: \"kubernetes.io/projected/fae6197d-57c4-4329-a982-2a458d366dcc-kube-api-access-xqncz\") pod \"fae6197d-57c4-4329-a982-2a458d366dcc\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.313272 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-custom-prometheus-ca\") pod \"fae6197d-57c4-4329-a982-2a458d366dcc\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.313307 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae6197d-57c4-4329-a982-2a458d366dcc-logs\") pod \"fae6197d-57c4-4329-a982-2a458d366dcc\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.313442 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-combined-ca-bundle\") pod \"fae6197d-57c4-4329-a982-2a458d366dcc\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.313568 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-config-data\") pod \"fae6197d-57c4-4329-a982-2a458d366dcc\" (UID: \"fae6197d-57c4-4329-a982-2a458d366dcc\") " Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.315068 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae6197d-57c4-4329-a982-2a458d366dcc-logs" (OuterVolumeSpecName: "logs") pod "fae6197d-57c4-4329-a982-2a458d366dcc" (UID: "fae6197d-57c4-4329-a982-2a458d366dcc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.319590 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae6197d-57c4-4329-a982-2a458d366dcc-kube-api-access-xqncz" (OuterVolumeSpecName: "kube-api-access-xqncz") pod "fae6197d-57c4-4329-a982-2a458d366dcc" (UID: "fae6197d-57c4-4329-a982-2a458d366dcc"). InnerVolumeSpecName "kube-api-access-xqncz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.344522 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fae6197d-57c4-4329-a982-2a458d366dcc" (UID: "fae6197d-57c4-4329-a982-2a458d366dcc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.348256 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "fae6197d-57c4-4329-a982-2a458d366dcc" (UID: "fae6197d-57c4-4329-a982-2a458d366dcc"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.383540 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-config-data" (OuterVolumeSpecName: "config-data") pod "fae6197d-57c4-4329-a982-2a458d366dcc" (UID: "fae6197d-57c4-4329-a982-2a458d366dcc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.415874 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.415902 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.415911 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqncz\" (UniqueName: \"kubernetes.io/projected/fae6197d-57c4-4329-a982-2a458d366dcc-kube-api-access-xqncz\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.415923 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fae6197d-57c4-4329-a982-2a458d366dcc-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.415931 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae6197d-57c4-4329-a982-2a458d366dcc-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.522962 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:20:58 crc kubenswrapper[4778]: I1205 16:20:58.534426 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:20:59 crc kubenswrapper[4778]: I1205 16:20:59.195411 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e8dbe352-c702-4ea0-abd4-a1caf21004cc","Type":"ContainerStarted","Data":"38ebddc5b6ec39f61bda74d99721e89a1416a684e6bbf245a2abf9282c3f97d1"} Dec 05 16:20:59 crc kubenswrapper[4778]: I1205 16:20:59.196492 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:20:59 crc kubenswrapper[4778]: I1205 16:20:59.225245 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.3163039899999998 podStartE2EDuration="5.225227302s" podCreationTimestamp="2025-12-05 16:20:54 +0000 UTC" firstStartedPulling="2025-12-05 16:20:55.108186006 +0000 UTC m=+1542.211982376" lastFinishedPulling="2025-12-05 16:20:58.017109298 +0000 UTC m=+1545.120905688" observedRunningTime="2025-12-05 16:20:59.220011293 +0000 UTC m=+1546.323807683" watchObservedRunningTime="2025-12-05 16:20:59.225227302 +0000 UTC m=+1546.329023672" Dec 05 16:20:59 crc kubenswrapper[4778]: I1205 16:20:59.260140 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fae6197d-57c4-4329-a982-2a458d366dcc" path="/var/lib/kubelet/pods/fae6197d-57c4-4329-a982-2a458d366dcc/volumes" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.093657 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wrfrw"] Dec 05 16:21:09 crc kubenswrapper[4778]: E1205 16:21:09.094428 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0536ee06-b83a-4947-9b31-40308c6ccb7a" containerName="watcher-applier" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.094440 
4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0536ee06-b83a-4947-9b31-40308c6ccb7a" containerName="watcher-applier" Dec 05 16:21:09 crc kubenswrapper[4778]: E1205 16:21:09.094456 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae6197d-57c4-4329-a982-2a458d366dcc" containerName="watcher-decision-engine" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.094462 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae6197d-57c4-4329-a982-2a458d366dcc" containerName="watcher-decision-engine" Dec 05 16:21:09 crc kubenswrapper[4778]: E1205 16:21:09.094474 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c177ae-4cee-447f-a426-18ddcb4af8e7" containerName="mariadb-account-delete" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.094481 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c177ae-4cee-447f-a426-18ddcb4af8e7" containerName="mariadb-account-delete" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.094641 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0536ee06-b83a-4947-9b31-40308c6ccb7a" containerName="watcher-applier" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.094654 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae6197d-57c4-4329-a982-2a458d366dcc" containerName="watcher-decision-engine" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.094662 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c177ae-4cee-447f-a426-18ddcb4af8e7" containerName="mariadb-account-delete" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.096085 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.110902 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrfrw"] Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.287109 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-utilities\") pod \"community-operators-wrfrw\" (UID: \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\") " pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.288207 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2j58\" (UniqueName: \"kubernetes.io/projected/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-kube-api-access-h2j58\") pod \"community-operators-wrfrw\" (UID: \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\") " pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.288320 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-catalog-content\") pod \"community-operators-wrfrw\" (UID: \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\") " pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.391995 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2j58\" (UniqueName: \"kubernetes.io/projected/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-kube-api-access-h2j58\") pod \"community-operators-wrfrw\" (UID: \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\") " 
pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.392493 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-catalog-content\") pod \"community-operators-wrfrw\" (UID: \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\") " pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.393533 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-catalog-content\") pod \"community-operators-wrfrw\" (UID: \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\") " pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.394675 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-utilities\") pod \"community-operators-wrfrw\" (UID: \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\") " pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.395105 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-utilities\") pod \"community-operators-wrfrw\" (UID: \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\") " pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.412792 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2j58\" (UniqueName: \"kubernetes.io/projected/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-kube-api-access-h2j58\") pod \"community-operators-wrfrw\" (UID: \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\") " pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.418754 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:09 crc kubenswrapper[4778]: I1205 16:21:09.894704 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrfrw"] Dec 05 16:21:10 crc kubenswrapper[4778]: I1205 16:21:10.302055 4778 generic.go:334] "Generic (PLEG): container finished" podID="3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" containerID="89ea5dd894a043fa4a095f9633f39c2149e08f68239c033c99d0c5a5328ea357" exitCode=0 Dec 05 16:21:10 crc kubenswrapper[4778]: I1205 16:21:10.302279 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrfrw" event={"ID":"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f","Type":"ContainerDied","Data":"89ea5dd894a043fa4a095f9633f39c2149e08f68239c033c99d0c5a5328ea357"} Dec 05 16:21:10 crc kubenswrapper[4778]: I1205 16:21:10.302429 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrfrw" event={"ID":"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f","Type":"ContainerStarted","Data":"90d5a4b071562d7002777b8dc1c476e02eefeba4a933e008f6b6424eb5aa66b4"} Dec 05 16:21:11 crc kubenswrapper[4778]: I1205 16:21:11.249004 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:21:11 crc kubenswrapper[4778]: E1205 16:21:11.249265 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:21:12 crc kubenswrapper[4778]: I1205 16:21:12.317949 4778 generic.go:334] "Generic (PLEG): container finished" podID="3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" containerID="b44002ba1d83034833412d9f9a45f4485798e4323ff94545f47ce2b6b542a8b9" exitCode=0 Dec 05 16:21:12 crc kubenswrapper[4778]: I1205 16:21:12.317993 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrfrw" event={"ID":"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f","Type":"ContainerDied","Data":"b44002ba1d83034833412d9f9a45f4485798e4323ff94545f47ce2b6b542a8b9"} Dec 05 16:21:13 crc kubenswrapper[4778]: I1205 16:21:13.327851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrfrw" event={"ID":"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f","Type":"ContainerStarted","Data":"87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c"} Dec 05 16:21:13 crc kubenswrapper[4778]: I1205 16:21:13.348498 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wrfrw" podStartSLOduration=1.944163785 podStartE2EDuration="4.348475006s" podCreationTimestamp="2025-12-05 16:21:09 +0000 UTC" firstStartedPulling="2025-12-05 16:21:10.304330526 +0000 UTC m=+1557.408126906" lastFinishedPulling="2025-12-05 16:21:12.708641747 +0000 UTC m=+1559.812438127" observedRunningTime="2025-12-05 16:21:13.346754011 +0000 UTC m=+1560.450550411" watchObservedRunningTime="2025-12-05 16:21:13.348475006 +0000 UTC m=+1560.452271406" Dec 05 16:21:19 crc kubenswrapper[4778]: I1205 16:21:19.419141 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wrfrw" Dec 05 
16:21:19 crc kubenswrapper[4778]: I1205 16:21:19.419868 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:19 crc kubenswrapper[4778]: I1205 16:21:19.470092 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:20 crc kubenswrapper[4778]: I1205 16:21:20.440154 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:23 crc kubenswrapper[4778]: I1205 16:21:23.084756 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wrfrw"] Dec 05 16:21:23 crc kubenswrapper[4778]: I1205 16:21:23.085266 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wrfrw" podUID="3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" containerName="registry-server" containerID="cri-o://87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c" gracePeriod=2 Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.083906 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.221814 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-catalog-content\") pod \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\" (UID: \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\") " Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.221908 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2j58\" (UniqueName: \"kubernetes.io/projected/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-kube-api-access-h2j58\") pod \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\" (UID: \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\") " Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.221980 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-utilities\") pod \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\" (UID: \"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f\") " Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.222958 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-utilities" (OuterVolumeSpecName: "utilities") pod "3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" (UID: "3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.229624 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-kube-api-access-h2j58" (OuterVolumeSpecName: "kube-api-access-h2j58") pod "3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" (UID: "3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f"). InnerVolumeSpecName "kube-api-access-h2j58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.249812 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:21:24 crc kubenswrapper[4778]: E1205 16:21:24.250056 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.274285 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" (UID: "3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.325569 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.325603 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2j58\" (UniqueName: \"kubernetes.io/projected/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-kube-api-access-h2j58\") on node \"crc\" DevicePath \"\"" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.325617 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.413040 4778 generic.go:334] "Generic (PLEG): container finished" podID="3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" containerID="87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c" exitCode=0 Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.413085 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrfrw" event={"ID":"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f","Type":"ContainerDied","Data":"87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c"} Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.413116 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrfrw" event={"ID":"3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f","Type":"ContainerDied","Data":"90d5a4b071562d7002777b8dc1c476e02eefeba4a933e008f6b6424eb5aa66b4"} Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.413140 4778 scope.go:117] "RemoveContainer" containerID="87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.413153 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrfrw" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.442928 4778 scope.go:117] "RemoveContainer" containerID="b44002ba1d83034833412d9f9a45f4485798e4323ff94545f47ce2b6b542a8b9" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.445455 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wrfrw"] Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.455061 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wrfrw"] Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.462602 4778 scope.go:117] "RemoveContainer" containerID="89ea5dd894a043fa4a095f9633f39c2149e08f68239c033c99d0c5a5328ea357" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.497447 4778 scope.go:117] "RemoveContainer" containerID="87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c" Dec 05 16:21:24 crc kubenswrapper[4778]: E1205 16:21:24.497798 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c\": container with ID starting with 87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c not found: ID does not exist" containerID="87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.497830 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c"} err="failed to get container status \"87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c\": rpc error: code = NotFound desc = could not find container \"87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c\": container with ID starting with 87b3b09cf1692eaa5e0f7812c6b65cade6e91ab7d283457a669634283e4a295c not found: ID does not exist" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.497849 4778 scope.go:117] "RemoveContainer" containerID="b44002ba1d83034833412d9f9a45f4485798e4323ff94545f47ce2b6b542a8b9" Dec 05 16:21:24 crc kubenswrapper[4778]: E1205 16:21:24.498076 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b44002ba1d83034833412d9f9a45f4485798e4323ff94545f47ce2b6b542a8b9\": container with ID starting with b44002ba1d83034833412d9f9a45f4485798e4323ff94545f47ce2b6b542a8b9 not found: ID does not exist" containerID="b44002ba1d83034833412d9f9a45f4485798e4323ff94545f47ce2b6b542a8b9" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.498094 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44002ba1d83034833412d9f9a45f4485798e4323ff94545f47ce2b6b542a8b9"} err="failed to get container status \"b44002ba1d83034833412d9f9a45f4485798e4323ff94545f47ce2b6b542a8b9\": rpc error: code = NotFound desc = could not find container \"b44002ba1d83034833412d9f9a45f4485798e4323ff94545f47ce2b6b542a8b9\": container with ID starting with b44002ba1d83034833412d9f9a45f4485798e4323ff94545f47ce2b6b542a8b9 not found: ID does not exist" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.498107 4778 scope.go:117] "RemoveContainer" containerID="89ea5dd894a043fa4a095f9633f39c2149e08f68239c033c99d0c5a5328ea357" Dec 05 16:21:24 crc kubenswrapper[4778]: E1205 16:21:24.498318 4778 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"89ea5dd894a043fa4a095f9633f39c2149e08f68239c033c99d0c5a5328ea357\": container with ID starting with 89ea5dd894a043fa4a095f9633f39c2149e08f68239c033c99d0c5a5328ea357 not found: ID does not exist" containerID="89ea5dd894a043fa4a095f9633f39c2149e08f68239c033c99d0c5a5328ea357" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.498340 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89ea5dd894a043fa4a095f9633f39c2149e08f68239c033c99d0c5a5328ea357"} err="failed to get container status \"89ea5dd894a043fa4a095f9633f39c2149e08f68239c033c99d0c5a5328ea357\": rpc error: code = NotFound desc = could not find container \"89ea5dd894a043fa4a095f9633f39c2149e08f68239c033c99d0c5a5328ea357\": container with ID starting with 89ea5dd894a043fa4a095f9633f39c2149e08f68239c033c99d0c5a5328ea357 not found: ID does not exist" Dec 05 16:21:24 crc kubenswrapper[4778]: I1205 16:21:24.625993 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 05 16:21:25 crc kubenswrapper[4778]: I1205 16:21:25.259856 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" path="/var/lib/kubelet/pods/3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f/volumes" Dec 05 16:21:36 crc kubenswrapper[4778]: I1205 16:21:36.249451 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:21:36 crc kubenswrapper[4778]: E1205 16:21:36.250273 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:21:50 crc kubenswrapper[4778]: I1205 16:21:50.250210 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:21:50 crc kubenswrapper[4778]: E1205 16:21:50.250959 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:22:01 crc kubenswrapper[4778]: I1205 16:22:01.249902 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:22:01 crc kubenswrapper[4778]: E1205 16:22:01.251017 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:22:12 crc kubenswrapper[4778]: I1205 16:22:12.249329 4778 scope.go:117] "RemoveContainer" 
containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:22:12 crc kubenswrapper[4778]: E1205 16:22:12.249992 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:22:23 crc kubenswrapper[4778]: I1205 16:22:23.255274 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:22:23 crc kubenswrapper[4778]: E1205 16:22:23.256213 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:22:36 crc kubenswrapper[4778]: I1205 16:22:36.249823 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:22:36 crc kubenswrapper[4778]: E1205 16:22:36.251730 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:22:51 crc kubenswrapper[4778]: I1205 16:22:51.249846 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:22:51 crc kubenswrapper[4778]: E1205 16:22:51.251732 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:23:04 crc kubenswrapper[4778]: I1205 16:23:04.249826 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:23:04 crc kubenswrapper[4778]: E1205 16:23:04.252422 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.097150 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6gh7d"] Dec 05 16:23:08 crc kubenswrapper[4778]: E1205 16:23:08.098830 4778 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" containerName="extract-content" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.098869 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" containerName="extract-content" Dec 05 16:23:08 crc kubenswrapper[4778]: E1205 16:23:08.098885 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" containerName="registry-server" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.098893 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" containerName="registry-server" Dec 05 16:23:08 crc kubenswrapper[4778]: E1205 16:23:08.098923 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" containerName="extract-utilities" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.098932 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" containerName="extract-utilities" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.099111 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a889b7c-5fc9-4cb8-b8c3-3fbaa3a9f69f" containerName="registry-server" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.100264 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.132935 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gh7d"] Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.162963 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-catalog-content\") pod \"redhat-marketplace-6gh7d\" (UID: \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\") " pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.163036 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5jgf\" (UniqueName: \"kubernetes.io/projected/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-kube-api-access-l5jgf\") pod \"redhat-marketplace-6gh7d\" (UID: \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\") " pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.163128 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-utilities\") pod \"redhat-marketplace-6gh7d\" (UID: \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\") " pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.264783 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-catalog-content\") pod \"redhat-marketplace-6gh7d\" (UID: \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\") " pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.264855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5jgf\" (UniqueName: \"kubernetes.io/projected/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-kube-api-access-l5jgf\") 
pod \"redhat-marketplace-6gh7d\" (UID: \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\") " pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.264952 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-utilities\") pod \"redhat-marketplace-6gh7d\" (UID: \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\") " pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.267990 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-utilities\") pod \"redhat-marketplace-6gh7d\" (UID: \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\") " pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.268503 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-catalog-content\") pod \"redhat-marketplace-6gh7d\" (UID: \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\") " pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.286476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5jgf\" (UniqueName: \"kubernetes.io/projected/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-kube-api-access-l5jgf\") pod \"redhat-marketplace-6gh7d\" (UID: \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\") " pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.462624 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:08 crc kubenswrapper[4778]: I1205 16:23:08.905005 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gh7d"] Dec 05 16:23:09 crc kubenswrapper[4778]: I1205 16:23:09.333342 4778 generic.go:334] "Generic (PLEG): container finished" podID="e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" containerID="795bd6cd8d6b73df6fd951f45c7b195c275f3cc4fb67d9479fbde76a1aa93880" exitCode=0 Dec 05 16:23:09 crc kubenswrapper[4778]: I1205 16:23:09.333552 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gh7d" event={"ID":"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd","Type":"ContainerDied","Data":"795bd6cd8d6b73df6fd951f45c7b195c275f3cc4fb67d9479fbde76a1aa93880"} Dec 05 16:23:09 crc kubenswrapper[4778]: I1205 16:23:09.333664 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gh7d" event={"ID":"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd","Type":"ContainerStarted","Data":"fbba2bd611a29853cb67ace7dcdace554ad48053fe718070fdce192feb97745c"} Dec 05 16:23:10 crc kubenswrapper[4778]: E1205 16:23:10.681534 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1162ce4_30c3_4ca9_aa3d_2899f8b151cd.slice/crio-2dde25d2304a9d4a4c8b5d176f6c30839db2b7f77e3dd4ce1c1cc436d892f7b4.scope\": RecentStats: unable to find data in memory cache]" Dec 05 16:23:11 crc kubenswrapper[4778]: I1205 16:23:11.354865 4778 generic.go:334] "Generic (PLEG): container finished" podID="e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" containerID="2dde25d2304a9d4a4c8b5d176f6c30839db2b7f77e3dd4ce1c1cc436d892f7b4" exitCode=0 Dec 05 16:23:11 crc kubenswrapper[4778]: I1205 16:23:11.354950 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gh7d" event={"ID":"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd","Type":"ContainerDied","Data":"2dde25d2304a9d4a4c8b5d176f6c30839db2b7f77e3dd4ce1c1cc436d892f7b4"} Dec 05 16:23:12 crc kubenswrapper[4778]: I1205 16:23:12.365018 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gh7d" event={"ID":"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd","Type":"ContainerStarted","Data":"610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9"} Dec 05 16:23:12 crc kubenswrapper[4778]: I1205 16:23:12.385142 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6gh7d" podStartSLOduration=1.946515066 podStartE2EDuration="4.385121651s" podCreationTimestamp="2025-12-05 16:23:08 +0000 UTC" firstStartedPulling="2025-12-05 16:23:09.335998427 +0000 UTC m=+1676.439794847" lastFinishedPulling="2025-12-05 16:23:11.774605052 +0000 UTC m=+1678.878401432" observedRunningTime="2025-12-05 16:23:12.383554589 +0000 UTC m=+1679.487350969" watchObservedRunningTime="2025-12-05 16:23:12.385121651 +0000 UTC m=+1679.488918031" Dec 05 16:23:18 crc kubenswrapper[4778]: I1205 16:23:18.463774 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:18 crc kubenswrapper[4778]: I1205 16:23:18.464574 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:18 crc kubenswrapper[4778]: I1205 16:23:18.525354 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:19 crc kubenswrapper[4778]: I1205 16:23:19.249262 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:23:19 crc kubenswrapper[4778]: E1205 16:23:19.249573 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:23:19 crc kubenswrapper[4778]: I1205 16:23:19.525450 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:22 crc kubenswrapper[4778]: I1205 16:23:22.088658 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gh7d"] Dec 05 16:23:22 crc kubenswrapper[4778]: I1205 16:23:22.089308 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6gh7d" podUID="e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" containerName="registry-server" containerID="cri-o://610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9" gracePeriod=2 Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.120671 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.308498 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5jgf\" (UniqueName: \"kubernetes.io/projected/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-kube-api-access-l5jgf\") pod \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\" (UID: \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\") " Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.308592 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-catalog-content\") pod \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\" (UID: \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\") " Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.308619 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-utilities\") pod \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\" (UID: \"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd\") " Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.311320 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-utilities" (OuterVolumeSpecName: "utilities") pod "e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" (UID: "e1162ce4-30c3-4ca9-aa3d-2899f8b151cd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.315585 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-kube-api-access-l5jgf" (OuterVolumeSpecName: "kube-api-access-l5jgf") pod "e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" (UID: "e1162ce4-30c3-4ca9-aa3d-2899f8b151cd"). InnerVolumeSpecName "kube-api-access-l5jgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.351893 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" (UID: "e1162ce4-30c3-4ca9-aa3d-2899f8b151cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.411214 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5jgf\" (UniqueName: \"kubernetes.io/projected/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-kube-api-access-l5jgf\") on node \"crc\" DevicePath \"\"" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.411256 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.411265 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.493425 4778 generic.go:334] "Generic (PLEG): container finished" podID="e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" containerID="610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9" exitCode=0 Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.493465 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gh7d" event={"ID":"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd","Type":"ContainerDied","Data":"610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9"} Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.493490 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gh7d" event={"ID":"e1162ce4-30c3-4ca9-aa3d-2899f8b151cd","Type":"ContainerDied","Data":"fbba2bd611a29853cb67ace7dcdace554ad48053fe718070fdce192feb97745c"} Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.493506 4778 scope.go:117] "RemoveContainer" containerID="610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.493638 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6gh7d" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.518560 4778 scope.go:117] "RemoveContainer" containerID="2dde25d2304a9d4a4c8b5d176f6c30839db2b7f77e3dd4ce1c1cc436d892f7b4" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.525833 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gh7d"] Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.531789 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gh7d"] Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.551687 4778 scope.go:117] "RemoveContainer" containerID="795bd6cd8d6b73df6fd951f45c7b195c275f3cc4fb67d9479fbde76a1aa93880" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.577962 4778 scope.go:117] "RemoveContainer" containerID="610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9" Dec 05 16:23:23 crc kubenswrapper[4778]: E1205 16:23:23.578412 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9\": container with ID starting with 610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9 not found: ID does not exist" containerID="610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.578473 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9"} err="failed to get container status \"610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9\": rpc error: code = NotFound desc = could not find container \"610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9\": container with ID starting with 610ff4196363fbfec38635006fb3390fa08e5db06206052862f2b0c275cfb6b9 not found: ID does not exist" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.578504 4778 scope.go:117] "RemoveContainer" containerID="2dde25d2304a9d4a4c8b5d176f6c30839db2b7f77e3dd4ce1c1cc436d892f7b4" Dec 05 16:23:23 crc kubenswrapper[4778]: E1205 16:23:23.578880 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dde25d2304a9d4a4c8b5d176f6c30839db2b7f77e3dd4ce1c1cc436d892f7b4\": container with ID starting with 2dde25d2304a9d4a4c8b5d176f6c30839db2b7f77e3dd4ce1c1cc436d892f7b4 not found: ID does not exist" containerID="2dde25d2304a9d4a4c8b5d176f6c30839db2b7f77e3dd4ce1c1cc436d892f7b4" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.578908 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dde25d2304a9d4a4c8b5d176f6c30839db2b7f77e3dd4ce1c1cc436d892f7b4"} err="failed to get container status \"2dde25d2304a9d4a4c8b5d176f6c30839db2b7f77e3dd4ce1c1cc436d892f7b4\": rpc error: code = NotFound desc = could not find container \"2dde25d2304a9d4a4c8b5d176f6c30839db2b7f77e3dd4ce1c1cc436d892f7b4\": container with ID starting with 2dde25d2304a9d4a4c8b5d176f6c30839db2b7f77e3dd4ce1c1cc436d892f7b4 not found: ID does not exist" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.578928 4778 scope.go:117] "RemoveContainer" containerID="795bd6cd8d6b73df6fd951f45c7b195c275f3cc4fb67d9479fbde76a1aa93880" Dec 05 16:23:23 crc kubenswrapper[4778]: E1205 16:23:23.579382 4778 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"795bd6cd8d6b73df6fd951f45c7b195c275f3cc4fb67d9479fbde76a1aa93880\": container with ID starting with 795bd6cd8d6b73df6fd951f45c7b195c275f3cc4fb67d9479fbde76a1aa93880 not found: ID does not exist" containerID="795bd6cd8d6b73df6fd951f45c7b195c275f3cc4fb67d9479fbde76a1aa93880" Dec 05 16:23:23 crc kubenswrapper[4778]: I1205 16:23:23.579414 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"795bd6cd8d6b73df6fd951f45c7b195c275f3cc4fb67d9479fbde76a1aa93880"} err="failed to get container status \"795bd6cd8d6b73df6fd951f45c7b195c275f3cc4fb67d9479fbde76a1aa93880\": rpc error: code = NotFound desc = could not find container \"795bd6cd8d6b73df6fd951f45c7b195c275f3cc4fb67d9479fbde76a1aa93880\": container with ID starting with 795bd6cd8d6b73df6fd951f45c7b195c275f3cc4fb67d9479fbde76a1aa93880 not found: ID does not exist" Dec 05 16:23:25 crc kubenswrapper[4778]: I1205 16:23:25.275503 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" path="/var/lib/kubelet/pods/e1162ce4-30c3-4ca9-aa3d-2899f8b151cd/volumes" Dec 05 16:23:34 crc kubenswrapper[4778]: I1205 16:23:34.250443 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:23:34 crc kubenswrapper[4778]: E1205 16:23:34.251241 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:23:45 crc kubenswrapper[4778]: I1205 16:23:45.249590 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:23:45 crc kubenswrapper[4778]: E1205 16:23:45.250361 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:23:56 crc kubenswrapper[4778]: I1205 16:23:56.249978 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:23:56 crc kubenswrapper[4778]: E1205 16:23:56.251252 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:24:08 crc kubenswrapper[4778]: I1205 16:24:08.250073 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:24:08 crc kubenswrapper[4778]: E1205 16:24:08.250909 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:24:20 crc kubenswrapper[4778]: I1205 16:24:20.250171 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:24:20 crc kubenswrapper[4778]: E1205 16:24:20.250937 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:24:34 crc kubenswrapper[4778]: I1205 16:24:34.250525 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:24:34 crc kubenswrapper[4778]: E1205 16:24:34.251290 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:24:47 crc kubenswrapper[4778]: I1205 16:24:47.136634 4778 scope.go:117] "RemoveContainer" containerID="18fc5d350d6ae1bbf416c9c3933499a56ae3a4cd5e84d67a95a739c1922899e6" Dec 05 16:24:49 crc kubenswrapper[4778]: I1205 16:24:49.254057 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:24:49 crc kubenswrapper[4778]: E1205 16:24:49.254936 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:25:01 crc kubenswrapper[4778]: I1205 16:25:01.250572 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:25:01 crc kubenswrapper[4778]: E1205 16:25:01.251858 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:25:16 crc kubenswrapper[4778]: I1205 16:25:16.250072 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:25:16 crc kubenswrapper[4778]: E1205 16:25:16.251022 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:25:29 crc kubenswrapper[4778]: I1205 16:25:29.249684 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:25:29 crc kubenswrapper[4778]: E1205 16:25:29.250445 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:25:42 crc kubenswrapper[4778]: I1205 16:25:42.249799 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:25:42 crc kubenswrapper[4778]: I1205 16:25:42.694630 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"07f810bc57816803590a70eb4acea4f15e7b3ff3f3449761e7328bfde876054a"} Dec 05 16:25:47 crc kubenswrapper[4778]: I1205 16:25:47.208820 4778 scope.go:117] "RemoveContainer" containerID="fcfbe6afb3bc84e41cb91c36beb0a756b1ad1d7ff69825fded05442a2a43afa7" Dec 05 16:25:47 crc kubenswrapper[4778]: I1205 16:25:47.233978 4778 scope.go:117] "RemoveContainer" containerID="dfc4c892a81342049d9ee0b5239096c2e5592c0c193dca1ea53413f885bd4a9b" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.598889 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-ghjd7"] Dec 05 16:25:53 crc kubenswrapper[4778]: E1205 16:25:53.599875 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" containerName="extract-utilities" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.599890 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" containerName="extract-utilities" Dec 05 16:25:53 crc kubenswrapper[4778]: E1205 16:25:53.599905 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" containerName="extract-content" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.599913 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" containerName="extract-content" Dec 05 16:25:53 crc kubenswrapper[4778]: E1205 16:25:53.599938 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" containerName="registry-server" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.599945 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" containerName="registry-server" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.600137 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1162ce4-30c3-4ca9-aa3d-2899f8b151cd" containerName="registry-server" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.600855 4778 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-ghjd7" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.616472 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-ghjd7"] Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.665736 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca0344fc-bf5f-4b80-a634-857e0851ea08-operator-scripts\") pod \"watcher-db-create-ghjd7\" (UID: \"ca0344fc-bf5f-4b80-a634-857e0851ea08\") " pod="watcher-kuttl-default/watcher-db-create-ghjd7" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.665795 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7kkx\" (UniqueName: \"kubernetes.io/projected/ca0344fc-bf5f-4b80-a634-857e0851ea08-kube-api-access-n7kkx\") pod \"watcher-db-create-ghjd7\" (UID: \"ca0344fc-bf5f-4b80-a634-857e0851ea08\") " pod="watcher-kuttl-default/watcher-db-create-ghjd7" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.697126 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-e724-account-create-update-gtcbp"] Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.698611 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.704193 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.706129 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-e724-account-create-update-gtcbp"] Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.768202 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca0344fc-bf5f-4b80-a634-857e0851ea08-operator-scripts\") pod \"watcher-db-create-ghjd7\" (UID: \"ca0344fc-bf5f-4b80-a634-857e0851ea08\") " pod="watcher-kuttl-default/watcher-db-create-ghjd7" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.768591 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75425853-3067-431a-8a1c-49370e8c6516-operator-scripts\") pod \"watcher-e724-account-create-update-gtcbp\" (UID: \"75425853-3067-431a-8a1c-49370e8c6516\") " pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.768696 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7kkx\" (UniqueName: \"kubernetes.io/projected/ca0344fc-bf5f-4b80-a634-857e0851ea08-kube-api-access-n7kkx\") pod \"watcher-db-create-ghjd7\" (UID: \"ca0344fc-bf5f-4b80-a634-857e0851ea08\") " pod="watcher-kuttl-default/watcher-db-create-ghjd7" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.768805 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5rzp\" (UniqueName: \"kubernetes.io/projected/75425853-3067-431a-8a1c-49370e8c6516-kube-api-access-q5rzp\") pod \"watcher-e724-account-create-update-gtcbp\" (UID: \"75425853-3067-431a-8a1c-49370e8c6516\") " 
pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.769808 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca0344fc-bf5f-4b80-a634-857e0851ea08-operator-scripts\") pod \"watcher-db-create-ghjd7\" (UID: \"ca0344fc-bf5f-4b80-a634-857e0851ea08\") " pod="watcher-kuttl-default/watcher-db-create-ghjd7" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.793948 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7kkx\" (UniqueName: \"kubernetes.io/projected/ca0344fc-bf5f-4b80-a634-857e0851ea08-kube-api-access-n7kkx\") pod \"watcher-db-create-ghjd7\" (UID: \"ca0344fc-bf5f-4b80-a634-857e0851ea08\") " pod="watcher-kuttl-default/watcher-db-create-ghjd7" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.869735 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5rzp\" (UniqueName: \"kubernetes.io/projected/75425853-3067-431a-8a1c-49370e8c6516-kube-api-access-q5rzp\") pod \"watcher-e724-account-create-update-gtcbp\" (UID: \"75425853-3067-431a-8a1c-49370e8c6516\") " pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.870132 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75425853-3067-431a-8a1c-49370e8c6516-operator-scripts\") pod \"watcher-e724-account-create-update-gtcbp\" (UID: \"75425853-3067-431a-8a1c-49370e8c6516\") " pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.870813 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75425853-3067-431a-8a1c-49370e8c6516-operator-scripts\") pod \"watcher-e724-account-create-update-gtcbp\" (UID: \"75425853-3067-431a-8a1c-49370e8c6516\") " pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.885551 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5rzp\" (UniqueName: \"kubernetes.io/projected/75425853-3067-431a-8a1c-49370e8c6516-kube-api-access-q5rzp\") pod \"watcher-e724-account-create-update-gtcbp\" (UID: \"75425853-3067-431a-8a1c-49370e8c6516\") " pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" Dec 05 16:25:53 crc kubenswrapper[4778]: I1205 16:25:53.919734 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-ghjd7" Dec 05 16:25:54 crc kubenswrapper[4778]: I1205 16:25:54.014898 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" Dec 05 16:25:54 crc kubenswrapper[4778]: I1205 16:25:54.391162 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-ghjd7"] Dec 05 16:25:54 crc kubenswrapper[4778]: I1205 16:25:54.557838 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-e724-account-create-update-gtcbp"] Dec 05 16:25:54 crc kubenswrapper[4778]: W1205 16:25:54.560615 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75425853_3067_431a_8a1c_49370e8c6516.slice/crio-4f7b3dbff729b2078bea27e8a0ea455d9e53fe66b10fac60af0586981a7b7772 WatchSource:0}: Error finding container 4f7b3dbff729b2078bea27e8a0ea455d9e53fe66b10fac60af0586981a7b7772: Status 404 returned error can't find the container with id 4f7b3dbff729b2078bea27e8a0ea455d9e53fe66b10fac60af0586981a7b7772 Dec 05 16:25:54 crc kubenswrapper[4778]: I1205 16:25:54.807475 4778 generic.go:334] "Generic (PLEG): container finished" podID="ca0344fc-bf5f-4b80-a634-857e0851ea08" containerID="ef29064e18100bd35c52e7374d9e4d2ed0ac6790d0745d07b1ed7990591f9f95" exitCode=0 Dec 05 16:25:54 crc kubenswrapper[4778]: I1205 16:25:54.807618 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-ghjd7" event={"ID":"ca0344fc-bf5f-4b80-a634-857e0851ea08","Type":"ContainerDied","Data":"ef29064e18100bd35c52e7374d9e4d2ed0ac6790d0745d07b1ed7990591f9f95"} Dec 05 16:25:54 crc kubenswrapper[4778]: I1205 16:25:54.807782 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-ghjd7" event={"ID":"ca0344fc-bf5f-4b80-a634-857e0851ea08","Type":"ContainerStarted","Data":"8d91dd769486c55727eccbd20f302a575d17f4cb779342ebd3194fe060418c64"} Dec 05 16:25:54 crc kubenswrapper[4778]: I1205 16:25:54.809396 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" event={"ID":"75425853-3067-431a-8a1c-49370e8c6516","Type":"ContainerStarted","Data":"83d7affd3e3bfd6fd473509b0492f5babc93dd1363a510ade81e81a7eb0b1f54"} Dec 05 16:25:54 crc kubenswrapper[4778]: I1205 16:25:54.809419 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" event={"ID":"75425853-3067-431a-8a1c-49370e8c6516","Type":"ContainerStarted","Data":"4f7b3dbff729b2078bea27e8a0ea455d9e53fe66b10fac60af0586981a7b7772"} Dec 05 16:25:54 crc kubenswrapper[4778]: I1205 16:25:54.847056 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" podStartSLOduration=1.8470368640000001 podStartE2EDuration="1.847036864s" podCreationTimestamp="2025-12-05 16:25:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:25:54.842892384 +0000 UTC m=+1841.946688764" watchObservedRunningTime="2025-12-05 16:25:54.847036864 +0000 UTC m=+1841.950833244" Dec 05 16:25:55 crc kubenswrapper[4778]: I1205 16:25:55.821139 4778 generic.go:334] "Generic (PLEG): container finished" podID="75425853-3067-431a-8a1c-49370e8c6516" containerID="83d7affd3e3bfd6fd473509b0492f5babc93dd1363a510ade81e81a7eb0b1f54" exitCode=0 Dec 05 16:25:55 crc kubenswrapper[4778]: I1205 16:25:55.821198 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" event={"ID":"75425853-3067-431a-8a1c-49370e8c6516","Type":"ContainerDied","Data":"83d7affd3e3bfd6fd473509b0492f5babc93dd1363a510ade81e81a7eb0b1f54"} Dec 05 16:25:56 crc kubenswrapper[4778]: I1205 16:25:56.191056 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-ghjd7" Dec 05 16:25:56 crc kubenswrapper[4778]: I1205 16:25:56.313768 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca0344fc-bf5f-4b80-a634-857e0851ea08-operator-scripts\") pod \"ca0344fc-bf5f-4b80-a634-857e0851ea08\" (UID: \"ca0344fc-bf5f-4b80-a634-857e0851ea08\") " Dec 05 16:25:56 crc kubenswrapper[4778]: I1205 16:25:56.313872 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7kkx\" (UniqueName: \"kubernetes.io/projected/ca0344fc-bf5f-4b80-a634-857e0851ea08-kube-api-access-n7kkx\") pod \"ca0344fc-bf5f-4b80-a634-857e0851ea08\" (UID: \"ca0344fc-bf5f-4b80-a634-857e0851ea08\") " Dec 05 16:25:56 crc kubenswrapper[4778]: I1205 16:25:56.315225 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca0344fc-bf5f-4b80-a634-857e0851ea08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca0344fc-bf5f-4b80-a634-857e0851ea08" (UID: "ca0344fc-bf5f-4b80-a634-857e0851ea08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:25:56 crc kubenswrapper[4778]: I1205 16:25:56.320441 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0344fc-bf5f-4b80-a634-857e0851ea08-kube-api-access-n7kkx" (OuterVolumeSpecName: "kube-api-access-n7kkx") pod "ca0344fc-bf5f-4b80-a634-857e0851ea08" (UID: "ca0344fc-bf5f-4b80-a634-857e0851ea08"). InnerVolumeSpecName "kube-api-access-n7kkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:25:56 crc kubenswrapper[4778]: I1205 16:25:56.415336 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca0344fc-bf5f-4b80-a634-857e0851ea08-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:25:56 crc kubenswrapper[4778]: I1205 16:25:56.415400 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7kkx\" (UniqueName: \"kubernetes.io/projected/ca0344fc-bf5f-4b80-a634-857e0851ea08-kube-api-access-n7kkx\") on node \"crc\" DevicePath \"\"" Dec 05 16:25:56 crc kubenswrapper[4778]: I1205 16:25:56.831408 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-ghjd7" Dec 05 16:25:56 crc kubenswrapper[4778]: I1205 16:25:56.831429 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-ghjd7" event={"ID":"ca0344fc-bf5f-4b80-a634-857e0851ea08","Type":"ContainerDied","Data":"8d91dd769486c55727eccbd20f302a575d17f4cb779342ebd3194fe060418c64"} Dec 05 16:25:56 crc kubenswrapper[4778]: I1205 16:25:56.831478 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d91dd769486c55727eccbd20f302a575d17f4cb779342ebd3194fe060418c64" Dec 05 16:25:57 crc kubenswrapper[4778]: I1205 16:25:57.187252 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" Dec 05 16:25:57 crc kubenswrapper[4778]: I1205 16:25:57.233816 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75425853-3067-431a-8a1c-49370e8c6516-operator-scripts\") pod \"75425853-3067-431a-8a1c-49370e8c6516\" (UID: \"75425853-3067-431a-8a1c-49370e8c6516\") " Dec 05 16:25:57 crc kubenswrapper[4778]: I1205 16:25:57.233887 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5rzp\" (UniqueName: \"kubernetes.io/projected/75425853-3067-431a-8a1c-49370e8c6516-kube-api-access-q5rzp\") pod \"75425853-3067-431a-8a1c-49370e8c6516\" (UID: \"75425853-3067-431a-8a1c-49370e8c6516\") " Dec 05 16:25:57 crc kubenswrapper[4778]: I1205 16:25:57.234470 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75425853-3067-431a-8a1c-49370e8c6516-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75425853-3067-431a-8a1c-49370e8c6516" (UID: "75425853-3067-431a-8a1c-49370e8c6516"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:25:57 crc kubenswrapper[4778]: I1205 16:25:57.241725 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75425853-3067-431a-8a1c-49370e8c6516-kube-api-access-q5rzp" (OuterVolumeSpecName: "kube-api-access-q5rzp") pod "75425853-3067-431a-8a1c-49370e8c6516" (UID: "75425853-3067-431a-8a1c-49370e8c6516"). InnerVolumeSpecName "kube-api-access-q5rzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:25:57 crc kubenswrapper[4778]: I1205 16:25:57.336632 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75425853-3067-431a-8a1c-49370e8c6516-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:25:57 crc kubenswrapper[4778]: I1205 16:25:57.336671 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5rzp\" (UniqueName: \"kubernetes.io/projected/75425853-3067-431a-8a1c-49370e8c6516-kube-api-access-q5rzp\") on node \"crc\" DevicePath \"\"" Dec 05 16:25:57 crc kubenswrapper[4778]: I1205 16:25:57.841465 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" event={"ID":"75425853-3067-431a-8a1c-49370e8c6516","Type":"ContainerDied","Data":"4f7b3dbff729b2078bea27e8a0ea455d9e53fe66b10fac60af0586981a7b7772"} Dec 05 16:25:57 crc kubenswrapper[4778]: I1205 16:25:57.841511 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f7b3dbff729b2078bea27e8a0ea455d9e53fe66b10fac60af0586981a7b7772" Dec 05 16:25:57 crc kubenswrapper[4778]: I1205 16:25:57.841523 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-e724-account-create-update-gtcbp" Dec 05 16:25:58 crc kubenswrapper[4778]: I1205 16:25:58.937150 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm"] Dec 05 16:25:58 crc kubenswrapper[4778]: E1205 16:25:58.937805 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75425853-3067-431a-8a1c-49370e8c6516" containerName="mariadb-account-create-update" Dec 05 16:25:58 crc kubenswrapper[4778]: I1205 16:25:58.937819 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="75425853-3067-431a-8a1c-49370e8c6516" containerName="mariadb-account-create-update" Dec 05 16:25:58 crc kubenswrapper[4778]: E1205 16:25:58.937834 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0344fc-bf5f-4b80-a634-857e0851ea08" containerName="mariadb-database-create" Dec 05 16:25:58 crc kubenswrapper[4778]: I1205 16:25:58.937842 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0344fc-bf5f-4b80-a634-857e0851ea08" containerName="mariadb-database-create" Dec 05 16:25:58 crc kubenswrapper[4778]: I1205 16:25:58.938028 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="75425853-3067-431a-8a1c-49370e8c6516" containerName="mariadb-account-create-update" Dec 05 16:25:58 crc kubenswrapper[4778]: I1205 16:25:58.938062 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca0344fc-bf5f-4b80-a634-857e0851ea08" containerName="mariadb-database-create" Dec 05 16:25:58 crc kubenswrapper[4778]: I1205 16:25:58.938673 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:58 crc kubenswrapper[4778]: I1205 16:25:58.968003 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-pvkqz" Dec 05 16:25:58 crc kubenswrapper[4778]: I1205 16:25:58.970094 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 05 16:25:58 crc kubenswrapper[4778]: I1205 16:25:58.992038 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm"] Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.070475 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jkf6\" (UniqueName: \"kubernetes.io/projected/48692f6b-de46-4fc0-88c1-85eb3a003a63-kube-api-access-7jkf6\") pod \"watcher-kuttl-db-sync-wbpzm\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.070526 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-wbpzm\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.070622 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-db-sync-config-data\") pod \"watcher-kuttl-db-sync-wbpzm\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " 
pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.070644 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-config-data\") pod \"watcher-kuttl-db-sync-wbpzm\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.171889 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-wbpzm\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.172000 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-db-sync-config-data\") pod \"watcher-kuttl-db-sync-wbpzm\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.172047 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-config-data\") pod \"watcher-kuttl-db-sync-wbpzm\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.172218 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jkf6\" (UniqueName: \"kubernetes.io/projected/48692f6b-de46-4fc0-88c1-85eb3a003a63-kube-api-access-7jkf6\") pod \"watcher-kuttl-db-sync-wbpzm\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.176695 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-db-sync-config-data\") pod \"watcher-kuttl-db-sync-wbpzm\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.177655 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-config-data\") pod \"watcher-kuttl-db-sync-wbpzm\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.178018 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-wbpzm\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.198179 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jkf6\" (UniqueName: \"kubernetes.io/projected/48692f6b-de46-4fc0-88c1-85eb3a003a63-kube-api-access-7jkf6\") pod \"watcher-kuttl-db-sync-wbpzm\" 
(UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.287636 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.744672 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm"] Dec 05 16:25:59 crc kubenswrapper[4778]: W1205 16:25:59.745182 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48692f6b_de46_4fc0_88c1_85eb3a003a63.slice/crio-1a98083e51361d5910056f44618b80dff8e798dfc08955ebb78aa54ee6db977e WatchSource:0}: Error finding container 1a98083e51361d5910056f44618b80dff8e798dfc08955ebb78aa54ee6db977e: Status 404 returned error can't find the container with id 1a98083e51361d5910056f44618b80dff8e798dfc08955ebb78aa54ee6db977e Dec 05 16:25:59 crc kubenswrapper[4778]: I1205 16:25:59.860841 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" event={"ID":"48692f6b-de46-4fc0-88c1-85eb3a003a63","Type":"ContainerStarted","Data":"1a98083e51361d5910056f44618b80dff8e798dfc08955ebb78aa54ee6db977e"} Dec 05 16:26:00 crc kubenswrapper[4778]: I1205 16:26:00.870583 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" event={"ID":"48692f6b-de46-4fc0-88c1-85eb3a003a63","Type":"ContainerStarted","Data":"925000619d53bbe63a1831dd13ba622e0aad205d8110b802c077c8630669cf4f"} Dec 05 16:26:02 crc kubenswrapper[4778]: I1205 16:26:02.887893 4778 generic.go:334] "Generic (PLEG): container finished" podID="48692f6b-de46-4fc0-88c1-85eb3a003a63" containerID="925000619d53bbe63a1831dd13ba622e0aad205d8110b802c077c8630669cf4f" exitCode=0 Dec 05 16:26:02 crc kubenswrapper[4778]: I1205 16:26:02.887996 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" event={"ID":"48692f6b-de46-4fc0-88c1-85eb3a003a63","Type":"ContainerDied","Data":"925000619d53bbe63a1831dd13ba622e0aad205d8110b802c077c8630669cf4f"} Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.270751 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.354901 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-db-sync-config-data\") pod \"48692f6b-de46-4fc0-88c1-85eb3a003a63\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.355122 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jkf6\" (UniqueName: \"kubernetes.io/projected/48692f6b-de46-4fc0-88c1-85eb3a003a63-kube-api-access-7jkf6\") pod \"48692f6b-de46-4fc0-88c1-85eb3a003a63\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.355157 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-config-data\") pod \"48692f6b-de46-4fc0-88c1-85eb3a003a63\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.355184 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-combined-ca-bundle\") pod \"48692f6b-de46-4fc0-88c1-85eb3a003a63\" (UID: \"48692f6b-de46-4fc0-88c1-85eb3a003a63\") " Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.360500 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48692f6b-de46-4fc0-88c1-85eb3a003a63-kube-api-access-7jkf6" (OuterVolumeSpecName: "kube-api-access-7jkf6") pod "48692f6b-de46-4fc0-88c1-85eb3a003a63" (UID: "48692f6b-de46-4fc0-88c1-85eb3a003a63"). InnerVolumeSpecName "kube-api-access-7jkf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.371698 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "48692f6b-de46-4fc0-88c1-85eb3a003a63" (UID: "48692f6b-de46-4fc0-88c1-85eb3a003a63"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.379300 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48692f6b-de46-4fc0-88c1-85eb3a003a63" (UID: "48692f6b-de46-4fc0-88c1-85eb3a003a63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.394215 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-config-data" (OuterVolumeSpecName: "config-data") pod "48692f6b-de46-4fc0-88c1-85eb3a003a63" (UID: "48692f6b-de46-4fc0-88c1-85eb3a003a63"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.457228 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jkf6\" (UniqueName: \"kubernetes.io/projected/48692f6b-de46-4fc0-88c1-85eb3a003a63-kube-api-access-7jkf6\") on node \"crc\" DevicePath \"\"" Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.457259 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.457275 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.457286 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48692f6b-de46-4fc0-88c1-85eb3a003a63-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.908080 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" event={"ID":"48692f6b-de46-4fc0-88c1-85eb3a003a63","Type":"ContainerDied","Data":"1a98083e51361d5910056f44618b80dff8e798dfc08955ebb78aa54ee6db977e"} Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.908516 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a98083e51361d5910056f44618b80dff8e798dfc08955ebb78aa54ee6db977e" Dec 05 16:26:04 crc kubenswrapper[4778]: I1205 16:26:04.908170 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.179077 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:26:05 crc kubenswrapper[4778]: E1205 16:26:05.179524 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48692f6b-de46-4fc0-88c1-85eb3a003a63" containerName="watcher-kuttl-db-sync" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.179541 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="48692f6b-de46-4fc0-88c1-85eb3a003a63" containerName="watcher-kuttl-db-sync" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.179754 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="48692f6b-de46-4fc0-88c1-85eb3a003a63" containerName="watcher-kuttl-db-sync" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.180929 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.186485 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.186690 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.187644 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.190768 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-pvkqz" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.193855 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.195221 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.200262 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.205809 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.234009 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.262702 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.264156 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.264324 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.268526 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.275552 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.275602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.275624 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.275643 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.275679 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.275706 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22twg\" (UniqueName: \"kubernetes.io/projected/b3d171fd-e9d8-4778-917f-ccfad7c27404-kube-api-access-22twg\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.275731 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e8805fe-ca05-4473-a495-51825049a597-logs\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.275764 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 
05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.275782 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.275797 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.275820 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3d171fd-e9d8-4778-917f-ccfad7c27404-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.275837 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cns2f\" (UniqueName: \"kubernetes.io/projected/0e8805fe-ca05-4473-a495-51825049a597-kube-api-access-cns2f\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377604 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828b1128-474f-4ce3-a67d-b0f9ac493824-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377652 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/828b1128-474f-4ce3-a67d-b0f9ac493824-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377695 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377749 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377767 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" 
(UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377793 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3d171fd-e9d8-4778-917f-ccfad7c27404-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377808 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cns2f\" (UniqueName: \"kubernetes.io/projected/0e8805fe-ca05-4473-a495-51825049a597-kube-api-access-cns2f\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377850 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828b1128-474f-4ce3-a67d-b0f9ac493824-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377875 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377898 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377923 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377939 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377954 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-585xh\" (UniqueName: \"kubernetes.io/projected/828b1128-474f-4ce3-a67d-b0f9ac493824-kube-api-access-585xh\") pod \"watcher-kuttl-applier-0\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.377988 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-public-tls-certs\") pod 
\"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.378014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22twg\" (UniqueName: \"kubernetes.io/projected/b3d171fd-e9d8-4778-917f-ccfad7c27404-kube-api-access-22twg\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.378036 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e8805fe-ca05-4473-a495-51825049a597-logs\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.378398 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e8805fe-ca05-4473-a495-51825049a597-logs\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.378741 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3d171fd-e9d8-4778-917f-ccfad7c27404-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.383227 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.390851 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.391672 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.392421 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.393045 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 
05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.395395 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.395468 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cns2f\" (UniqueName: \"kubernetes.io/projected/0e8805fe-ca05-4473-a495-51825049a597-kube-api-access-cns2f\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.396333 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.400978 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22twg\" (UniqueName: \"kubernetes.io/projected/b3d171fd-e9d8-4778-917f-ccfad7c27404-kube-api-access-22twg\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.401273 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.480002 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828b1128-474f-4ce3-a67d-b0f9ac493824-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.480416 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/828b1128-474f-4ce3-a67d-b0f9ac493824-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.480504 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828b1128-474f-4ce3-a67d-b0f9ac493824-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.480552 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-585xh\" (UniqueName: \"kubernetes.io/projected/828b1128-474f-4ce3-a67d-b0f9ac493824-kube-api-access-585xh\") pod \"watcher-kuttl-applier-0\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc 
kubenswrapper[4778]: I1205 16:26:05.480965 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/828b1128-474f-4ce3-a67d-b0f9ac493824-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.484817 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828b1128-474f-4ce3-a67d-b0f9ac493824-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.485050 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828b1128-474f-4ce3-a67d-b0f9ac493824-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.500168 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-585xh\" (UniqueName: \"kubernetes.io/projected/828b1128-474f-4ce3-a67d-b0f9ac493824-kube-api-access-585xh\") pod \"watcher-kuttl-applier-0\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.510157 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.521489 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.589738 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.826862 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.916400 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"0e8805fe-ca05-4473-a495-51825049a597","Type":"ContainerStarted","Data":"13a7323d690b0903a332b5214cbbaec26482f6f4e79a1870a0cd5fb31e5a7bdc"} Dec 05 16:26:05 crc kubenswrapper[4778]: I1205 16:26:05.982134 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:26:05 crc kubenswrapper[4778]: W1205 16:26:05.990309 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3d171fd_e9d8_4778_917f_ccfad7c27404.slice/crio-c3b10391d04d3a880f10a8d6cd0da35a9bdb486f85e5e3dcd58da4536702fb14 WatchSource:0}: Error finding container c3b10391d04d3a880f10a8d6cd0da35a9bdb486f85e5e3dcd58da4536702fb14: Status 404 returned error can't find the container with id c3b10391d04d3a880f10a8d6cd0da35a9bdb486f85e5e3dcd58da4536702fb14 Dec 05 16:26:06 crc kubenswrapper[4778]: I1205 16:26:06.140259 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:26:06 crc kubenswrapper[4778]: W1205 16:26:06.153500 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod828b1128_474f_4ce3_a67d_b0f9ac493824.slice/crio-3922efa06ca427c639b8f4a3a46726d71667cd834dfd50a5862d1a02c49ced25 WatchSource:0}: Error finding container 3922efa06ca427c639b8f4a3a46726d71667cd834dfd50a5862d1a02c49ced25: Status 404 returned error can't find the container with id 3922efa06ca427c639b8f4a3a46726d71667cd834dfd50a5862d1a02c49ced25 Dec 05 16:26:06 crc kubenswrapper[4778]: I1205 16:26:06.926651 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerStarted","Data":"0b9de141242556e8ec091049be3a3266fb66807451c69f288412a25f441d5b73"} Dec 05 16:26:06 crc kubenswrapper[4778]: I1205 16:26:06.926910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerStarted","Data":"c3b10391d04d3a880f10a8d6cd0da35a9bdb486f85e5e3dcd58da4536702fb14"} Dec 05 16:26:06 crc kubenswrapper[4778]: I1205 16:26:06.928745 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"828b1128-474f-4ce3-a67d-b0f9ac493824","Type":"ContainerStarted","Data":"77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a"} Dec 05 16:26:06 crc kubenswrapper[4778]: I1205 16:26:06.928794 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"828b1128-474f-4ce3-a67d-b0f9ac493824","Type":"ContainerStarted","Data":"3922efa06ca427c639b8f4a3a46726d71667cd834dfd50a5862d1a02c49ced25"} Dec 05 16:26:06 crc kubenswrapper[4778]: I1205 16:26:06.931029 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"0e8805fe-ca05-4473-a495-51825049a597","Type":"ContainerStarted","Data":"4d7ffc071fc8e981492fa34ef8460623f6088a38943ae8b307b43d627d7ab2ba"} Dec 05 16:26:06 crc kubenswrapper[4778]: I1205 16:26:06.931161 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"0e8805fe-ca05-4473-a495-51825049a597","Type":"ContainerStarted","Data":"92f1c32d521a3ad41b7358af7fd8f9efca8bd5c33ea07ba39afbd9efdcc0d083"} Dec 05 16:26:06 crc kubenswrapper[4778]: I1205 16:26:06.933785 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:06 crc kubenswrapper[4778]: I1205 16:26:06.961012 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.9609915679999999 podStartE2EDuration="1.960991568s" podCreationTimestamp="2025-12-05 16:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:26:06.951444474 +0000 UTC m=+1854.055240894" watchObservedRunningTime="2025-12-05 16:26:06.960991568 +0000 UTC m=+1854.064787958" Dec 05 16:26:06 crc kubenswrapper[4778]: I1205 16:26:06.990743 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=1.9898270980000001 podStartE2EDuration="1.989827098s" podCreationTimestamp="2025-12-05 16:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:26:06.969090774 +0000 UTC m=+1854.072887174" watchObservedRunningTime="2025-12-05 16:26:06.989827098 +0000 UTC m=+1854.093623488" Dec 05 16:26:06 crc kubenswrapper[4778]: I1205 16:26:06.998415 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.998400476 podStartE2EDuration="1.998400476s" podCreationTimestamp="2025-12-05 16:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:26:06.98579364 +0000 UTC m=+1854.089590020" watchObservedRunningTime="2025-12-05 16:26:06.998400476 +0000 UTC m=+1854.102196856" Dec 05 16:26:08 crc kubenswrapper[4778]: I1205 16:26:08.947077 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 16:26:09 crc kubenswrapper[4778]: I1205 16:26:09.095150 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:09 crc kubenswrapper[4778]: I1205 16:26:09.962429 4778 generic.go:334] "Generic (PLEG): container finished" podID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerID="0b9de141242556e8ec091049be3a3266fb66807451c69f288412a25f441d5b73" exitCode=1 Dec 05 16:26:09 crc kubenswrapper[4778]: I1205 16:26:09.962512 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerDied","Data":"0b9de141242556e8ec091049be3a3266fb66807451c69f288412a25f441d5b73"} Dec 05 16:26:09 crc kubenswrapper[4778]: I1205 16:26:09.963748 4778 scope.go:117] "RemoveContainer" containerID="0b9de141242556e8ec091049be3a3266fb66807451c69f288412a25f441d5b73" Dec 05 16:26:10 crc kubenswrapper[4778]: I1205 16:26:10.511308 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:10 crc kubenswrapper[4778]: I1205 16:26:10.589922 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:10 crc kubenswrapper[4778]: I1205 16:26:10.972323 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerStarted","Data":"61108817d5f1077af3ba7599452bc823d45e3a0f82207f17d7e2ee59a4f192d2"} Dec 05 16:26:12 crc kubenswrapper[4778]: I1205 16:26:12.992854 4778 generic.go:334] "Generic (PLEG): container finished" podID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerID="61108817d5f1077af3ba7599452bc823d45e3a0f82207f17d7e2ee59a4f192d2" exitCode=1 Dec 05 16:26:12 crc kubenswrapper[4778]: I1205 16:26:12.992915 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerDied","Data":"61108817d5f1077af3ba7599452bc823d45e3a0f82207f17d7e2ee59a4f192d2"} Dec 05 16:26:12 crc kubenswrapper[4778]: I1205 16:26:12.992996 4778 scope.go:117] "RemoveContainer" containerID="0b9de141242556e8ec091049be3a3266fb66807451c69f288412a25f441d5b73" Dec 05 16:26:12 crc kubenswrapper[4778]: I1205 16:26:12.993554 4778 scope.go:117] "RemoveContainer" containerID="61108817d5f1077af3ba7599452bc823d45e3a0f82207f17d7e2ee59a4f192d2" Dec 05 16:26:12 crc kubenswrapper[4778]: E1205 16:26:12.993901 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:26:15 crc kubenswrapper[4778]: I1205 16:26:15.510849 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:15 crc kubenswrapper[4778]: I1205 16:26:15.519079 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:15 crc kubenswrapper[4778]: I1205 16:26:15.522078 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:15 crc kubenswrapper[4778]: I1205 16:26:15.522128 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:15 crc kubenswrapper[4778]: I1205 16:26:15.522748 4778 scope.go:117] "RemoveContainer" containerID="61108817d5f1077af3ba7599452bc823d45e3a0f82207f17d7e2ee59a4f192d2" Dec 05 16:26:15 crc kubenswrapper[4778]: E1205 16:26:15.523147 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:26:15 crc kubenswrapper[4778]: I1205 16:26:15.590813 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:15 crc kubenswrapper[4778]: I1205 16:26:15.619976 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:16 crc kubenswrapper[4778]: I1205 16:26:16.019625 4778 scope.go:117] "RemoveContainer" containerID="61108817d5f1077af3ba7599452bc823d45e3a0f82207f17d7e2ee59a4f192d2" Dec 05 16:26:16 crc kubenswrapper[4778]: E1205 16:26:16.020334 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:26:16 crc kubenswrapper[4778]: I1205 16:26:16.029950 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:26:16 crc kubenswrapper[4778]: I1205 16:26:16.043989 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:26:27 crc kubenswrapper[4778]: I1205 16:26:27.250347 4778 scope.go:117] "RemoveContainer" containerID="61108817d5f1077af3ba7599452bc823d45e3a0f82207f17d7e2ee59a4f192d2" Dec 05 16:26:28 crc kubenswrapper[4778]: I1205 16:26:28.143746 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerStarted","Data":"b90a843b0ad5d7970b25a8324b813dc846f63f063870e1707836d6678da8c12a"} Dec 05 16:26:31 crc kubenswrapper[4778]: I1205 16:26:31.171481 4778 generic.go:334] "Generic (PLEG): container finished" podID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerID="b90a843b0ad5d7970b25a8324b813dc846f63f063870e1707836d6678da8c12a" exitCode=1 Dec 05 16:26:31 crc kubenswrapper[4778]: I1205 16:26:31.171541 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerDied","Data":"b90a843b0ad5d7970b25a8324b813dc846f63f063870e1707836d6678da8c12a"} Dec 05 16:26:31 crc kubenswrapper[4778]: I1205 16:26:31.172077 4778 scope.go:117] "RemoveContainer" containerID="61108817d5f1077af3ba7599452bc823d45e3a0f82207f17d7e2ee59a4f192d2" Dec 05 16:26:31 crc kubenswrapper[4778]: I1205 16:26:31.172749 4778 scope.go:117] "RemoveContainer" containerID="b90a843b0ad5d7970b25a8324b813dc846f63f063870e1707836d6678da8c12a" Dec 05 16:26:31 crc kubenswrapper[4778]: E1205 16:26:31.173062 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:26:35 crc kubenswrapper[4778]: I1205 16:26:35.522205 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:35 crc kubenswrapper[4778]: I1205 16:26:35.522781 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:35 crc kubenswrapper[4778]: I1205 16:26:35.522796 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:35 crc kubenswrapper[4778]: I1205 16:26:35.522808 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:26:35 crc kubenswrapper[4778]: I1205 16:26:35.523397 4778 scope.go:117] "RemoveContainer" containerID="b90a843b0ad5d7970b25a8324b813dc846f63f063870e1707836d6678da8c12a" Dec 05 16:26:35 crc kubenswrapper[4778]: E1205 16:26:35.523605 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:26:37 crc kubenswrapper[4778]: E1205 16:26:37.138297 4778 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.130:55196->38.102.83.130:42485: write tcp 38.102.83.130:55196->38.102.83.130:42485: write: broken pipe Dec 05 16:26:47 crc kubenswrapper[4778]: I1205 16:26:47.323168 4778 scope.go:117] "RemoveContainer" containerID="f0823131a3b33ee75eea465cae545dbb67d8936f66dd660799b6e6b558458d2c" Dec 05 16:26:47 crc kubenswrapper[4778]: I1205 16:26:47.369957 4778 scope.go:117] "RemoveContainer" containerID="25c66e645ce069181d360caddd549661fd3cc12764529f300fa4c3f33e7f104d" Dec 05 16:26:47 crc kubenswrapper[4778]: I1205 16:26:47.396410 4778 scope.go:117] "RemoveContainer" containerID="8697caa8b628fa542a4f77cad65065fb6e0dcb92725f5221ba626624b12ba613" Dec 05 16:26:47 crc kubenswrapper[4778]: I1205 16:26:47.421108 4778 scope.go:117] "RemoveContainer" containerID="63751f32c9e71d97bf22cce360c68eaf72a1a6f2ef0671e11fd2d3827ffb489b" Dec 05 16:26:47 crc kubenswrapper[4778]: I1205 16:26:47.482699 4778 scope.go:117] "RemoveContainer" containerID="6035b9073a3ba4c4af735b261e3830b18b41f9149b595c5df4211feb1177d47c" Dec 05 16:26:49 crc kubenswrapper[4778]: I1205 16:26:49.249941 4778 scope.go:117] "RemoveContainer" containerID="b90a843b0ad5d7970b25a8324b813dc846f63f063870e1707836d6678da8c12a" Dec 05 16:26:49 crc kubenswrapper[4778]: E1205 16:26:49.250511 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:27:00 crc kubenswrapper[4778]: I1205 16:27:00.249623 4778 scope.go:117] "RemoveContainer" containerID="b90a843b0ad5d7970b25a8324b813dc846f63f063870e1707836d6678da8c12a" Dec 05 16:27:01 crc kubenswrapper[4778]: I1205 16:27:01.416641 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerStarted","Data":"da53ce8e15e8aea78a5a12ce5b4b674c74bcda1faa8637434893aa7214876404"} Dec 05 16:27:03 crc kubenswrapper[4778]: I1205 
16:27:03.446645 4778 generic.go:334] "Generic (PLEG): container finished" podID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerID="da53ce8e15e8aea78a5a12ce5b4b674c74bcda1faa8637434893aa7214876404" exitCode=1 Dec 05 16:27:03 crc kubenswrapper[4778]: I1205 16:27:03.446738 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerDied","Data":"da53ce8e15e8aea78a5a12ce5b4b674c74bcda1faa8637434893aa7214876404"} Dec 05 16:27:03 crc kubenswrapper[4778]: I1205 16:27:03.446829 4778 scope.go:117] "RemoveContainer" containerID="b90a843b0ad5d7970b25a8324b813dc846f63f063870e1707836d6678da8c12a" Dec 05 16:27:03 crc kubenswrapper[4778]: I1205 16:27:03.447619 4778 scope.go:117] "RemoveContainer" containerID="da53ce8e15e8aea78a5a12ce5b4b674c74bcda1faa8637434893aa7214876404" Dec 05 16:27:03 crc kubenswrapper[4778]: E1205 16:27:03.447897 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:27:05 crc kubenswrapper[4778]: I1205 16:27:05.522330 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:27:05 crc kubenswrapper[4778]: I1205 16:27:05.522641 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:27:05 crc kubenswrapper[4778]: I1205 16:27:05.522656 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:27:05 crc kubenswrapper[4778]: I1205 16:27:05.522668 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:27:05 crc kubenswrapper[4778]: I1205 16:27:05.523285 4778 scope.go:117] "RemoveContainer" containerID="da53ce8e15e8aea78a5a12ce5b4b674c74bcda1faa8637434893aa7214876404" Dec 05 16:27:05 crc kubenswrapper[4778]: E1205 16:27:05.523539 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:27:20 crc kubenswrapper[4778]: I1205 16:27:20.055765 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q"] Dec 05 16:27:20 crc kubenswrapper[4778]: I1205 16:27:20.062644 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-db-create-xt6b2"] Dec 05 16:27:20 crc kubenswrapper[4778]: I1205 16:27:20.072414 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-db-create-xt6b2"] Dec 05 16:27:20 crc kubenswrapper[4778]: I1205 16:27:20.078856 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-cf4a-account-create-update-7wh4q"] Dec 05 
16:27:21 crc kubenswrapper[4778]: I1205 16:27:21.249504 4778 scope.go:117] "RemoveContainer" containerID="da53ce8e15e8aea78a5a12ce5b4b674c74bcda1faa8637434893aa7214876404" Dec 05 16:27:21 crc kubenswrapper[4778]: E1205 16:27:21.249766 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:27:21 crc kubenswrapper[4778]: I1205 16:27:21.268164 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c3c6ad9-282f-408e-af9d-c4053b6c5ddf" path="/var/lib/kubelet/pods/3c3c6ad9-282f-408e-af9d-c4053b6c5ddf/volumes" Dec 05 16:27:21 crc kubenswrapper[4778]: I1205 16:27:21.269644 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63653c0c-59aa-47e0-8748-dd487c207a03" path="/var/lib/kubelet/pods/63653c0c-59aa-47e0-8748-dd487c207a03/volumes" Dec 05 16:27:34 crc kubenswrapper[4778]: I1205 16:27:34.249561 4778 scope.go:117] "RemoveContainer" containerID="da53ce8e15e8aea78a5a12ce5b4b674c74bcda1faa8637434893aa7214876404" Dec 05 16:27:34 crc kubenswrapper[4778]: E1205 16:27:34.251616 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:27:35 crc kubenswrapper[4778]: E1205 16:27:35.620436 4778 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.130:32822->38.102.83.130:42485: read tcp 38.102.83.130:32822->38.102.83.130:42485: read: connection reset by peer Dec 05 16:27:47 crc kubenswrapper[4778]: I1205 16:27:47.249973 4778 scope.go:117] "RemoveContainer" containerID="da53ce8e15e8aea78a5a12ce5b4b674c74bcda1faa8637434893aa7214876404" Dec 05 16:27:47 crc kubenswrapper[4778]: I1205 16:27:47.593289 4778 scope.go:117] "RemoveContainer" containerID="18e7a51224dad800e7e180f4b2341dde27c565a5a3bf48e60cb662f7dedd18fd" Dec 05 16:27:47 crc kubenswrapper[4778]: I1205 16:27:47.620413 4778 scope.go:117] "RemoveContainer" containerID="608c15605560825a423471eafd8c2268c1fae110483ad831b002deeeb4b3f41b" Dec 05 16:27:47 crc kubenswrapper[4778]: I1205 16:27:47.695489 4778 scope.go:117] "RemoveContainer" containerID="65a9ad3dc7d5495c002b0f8976b4c53e8ad1e97cfad9e600c934cb87a0468ca6" Dec 05 16:27:48 crc kubenswrapper[4778]: I1205 16:27:48.819409 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerStarted","Data":"568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072"} Dec 05 16:27:51 crc kubenswrapper[4778]: I1205 16:27:51.844584 4778 generic.go:334] "Generic (PLEG): container finished" podID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerID="568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072" exitCode=1 Dec 05 16:27:51 crc kubenswrapper[4778]: I1205 16:27:51.844677 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerDied","Data":"568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072"} Dec 05 16:27:51 crc kubenswrapper[4778]: I1205 16:27:51.844961 4778 scope.go:117] "RemoveContainer" containerID="da53ce8e15e8aea78a5a12ce5b4b674c74bcda1faa8637434893aa7214876404" Dec 05 16:27:51 crc kubenswrapper[4778]: I1205 16:27:51.845782 4778 scope.go:117] "RemoveContainer" containerID="568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072" Dec 05 16:27:51 crc kubenswrapper[4778]: E1205 16:27:51.846138 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:27:55 crc kubenswrapper[4778]: I1205 16:27:55.522982 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:27:55 crc kubenswrapper[4778]: I1205 16:27:55.523611 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:27:55 crc kubenswrapper[4778]: I1205 16:27:55.524217 4778 scope.go:117] "RemoveContainer" containerID="568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072" Dec 05 16:27:55 crc kubenswrapper[4778]: E1205 16:27:55.524454 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:28:03 crc kubenswrapper[4778]: I1205 16:28:03.415105 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:28:03 crc kubenswrapper[4778]: I1205 16:28:03.415630 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:28:05 crc kubenswrapper[4778]: I1205 16:28:05.522175 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:28:05 crc kubenswrapper[4778]: I1205 16:28:05.522542 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:28:05 crc kubenswrapper[4778]: I1205 16:28:05.523483 4778 scope.go:117] "RemoveContainer" containerID="568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072" Dec 05 16:28:05 crc kubenswrapper[4778]: E1205 16:28:05.523740 4778 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:28:11 crc kubenswrapper[4778]: I1205 16:28:11.049256 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-5996h"] Dec 05 16:28:11 crc kubenswrapper[4778]: I1205 16:28:11.060652 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-5996h"] Dec 05 16:28:11 crc kubenswrapper[4778]: I1205 16:28:11.262791 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94083ec4-63e0-44b2-9181-808017479ef8" path="/var/lib/kubelet/pods/94083ec4-63e0-44b2-9181-808017479ef8/volumes" Dec 05 16:28:18 crc kubenswrapper[4778]: I1205 16:28:18.249812 4778 scope.go:117] "RemoveContainer" containerID="568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072" Dec 05 16:28:18 crc kubenswrapper[4778]: E1205 16:28:18.250635 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:28:30 crc kubenswrapper[4778]: I1205 16:28:30.033226 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-fsmts"] Dec 05 16:28:30 crc kubenswrapper[4778]: I1205 16:28:30.045223 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-fsmts"] Dec 05 16:28:30 crc kubenswrapper[4778]: I1205 16:28:30.250591 4778 scope.go:117] "RemoveContainer" containerID="568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072" Dec 05 16:28:30 crc kubenswrapper[4778]: E1205 16:28:30.250885 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:28:31 crc kubenswrapper[4778]: I1205 16:28:31.260754 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd92a3b6-617a-4962-b18c-50ab46cbfe54" path="/var/lib/kubelet/pods/cd92a3b6-617a-4962-b18c-50ab46cbfe54/volumes" Dec 05 16:28:33 crc kubenswrapper[4778]: I1205 16:28:33.414876 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:28:33 crc kubenswrapper[4778]: I1205 16:28:33.415246 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.098111 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tprfd"] Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.101574 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.110261 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tprfd"] Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.173962 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-utilities\") pod \"certified-operators-tprfd\" (UID: \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\") " pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.174050 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-catalog-content\") pod \"certified-operators-tprfd\" (UID: \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\") " pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.174089 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfqr5\" (UniqueName: \"kubernetes.io/projected/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-kube-api-access-wfqr5\") pod \"certified-operators-tprfd\" (UID: \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\") " pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.275657 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-utilities\") pod \"certified-operators-tprfd\" (UID: \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\") " pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.275723 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-catalog-content\") pod \"certified-operators-tprfd\" (UID: \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\") " pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.275758 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfqr5\" (UniqueName: \"kubernetes.io/projected/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-kube-api-access-wfqr5\") pod \"certified-operators-tprfd\" (UID: \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\") " pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.276348 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-catalog-content\") pod \"certified-operators-tprfd\" (UID: \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\") " pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.276470 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-utilities\") pod \"certified-operators-tprfd\" (UID: \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\") " pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.296401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfqr5\" (UniqueName: \"kubernetes.io/projected/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-kube-api-access-wfqr5\") pod \"certified-operators-tprfd\" (UID: \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\") " pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:37 crc kubenswrapper[4778]: I1205 16:28:37.422999 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:38 crc kubenswrapper[4778]: I1205 16:28:38.002847 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tprfd"] Dec 05 16:28:38 crc kubenswrapper[4778]: I1205 16:28:38.233059 4778 generic.go:334] "Generic (PLEG): container finished" podID="f3e7d018-2ff1-43d3-ba88-2929e3b98be5" containerID="8da5110225a563df1ac74603cdbf1abb39fe2fd4430a8e3a73da41298ded5b03" exitCode=0 Dec 05 16:28:38 crc kubenswrapper[4778]: I1205 16:28:38.233298 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tprfd" event={"ID":"f3e7d018-2ff1-43d3-ba88-2929e3b98be5","Type":"ContainerDied","Data":"8da5110225a563df1ac74603cdbf1abb39fe2fd4430a8e3a73da41298ded5b03"} Dec 05 16:28:38 crc kubenswrapper[4778]: I1205 16:28:38.233395 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tprfd" event={"ID":"f3e7d018-2ff1-43d3-ba88-2929e3b98be5","Type":"ContainerStarted","Data":"f2ae55d1162afe21486be6634dbbdc812e8a6e63397ef401ba6967149e3b136d"} Dec 05 16:28:38 crc kubenswrapper[4778]: I1205 16:28:38.235230 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 16:28:39 crc kubenswrapper[4778]: I1205 16:28:39.244039 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tprfd" event={"ID":"f3e7d018-2ff1-43d3-ba88-2929e3b98be5","Type":"ContainerStarted","Data":"27db4ef0df8aef410d2e84f83683f04733278511557dcd59d9b12823668acbd2"} Dec 05 16:28:40 crc kubenswrapper[4778]: I1205 16:28:40.275678 4778 generic.go:334] "Generic (PLEG): container finished" podID="f3e7d018-2ff1-43d3-ba88-2929e3b98be5" containerID="27db4ef0df8aef410d2e84f83683f04733278511557dcd59d9b12823668acbd2" exitCode=0 Dec 05 16:28:40 crc kubenswrapper[4778]: I1205 16:28:40.275733 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tprfd" event={"ID":"f3e7d018-2ff1-43d3-ba88-2929e3b98be5","Type":"ContainerDied","Data":"27db4ef0df8aef410d2e84f83683f04733278511557dcd59d9b12823668acbd2"} Dec 05 16:28:42 crc kubenswrapper[4778]: I1205 16:28:42.250401 4778 scope.go:117] "RemoveContainer" containerID="568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072" Dec 05 16:28:42 crc kubenswrapper[4778]: E1205 16:28:42.250946 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine 
pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:28:42 crc kubenswrapper[4778]: I1205 16:28:42.296614 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tprfd" event={"ID":"f3e7d018-2ff1-43d3-ba88-2929e3b98be5","Type":"ContainerStarted","Data":"9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3"} Dec 05 16:28:42 crc kubenswrapper[4778]: I1205 16:28:42.314481 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tprfd" podStartSLOduration=1.9039605179999999 podStartE2EDuration="5.31446003s" podCreationTimestamp="2025-12-05 16:28:37 +0000 UTC" firstStartedPulling="2025-12-05 16:28:38.235018336 +0000 UTC m=+2005.338814716" lastFinishedPulling="2025-12-05 16:28:41.645517848 +0000 UTC m=+2008.749314228" observedRunningTime="2025-12-05 16:28:42.310545835 +0000 UTC m=+2009.414342235" watchObservedRunningTime="2025-12-05 16:28:42.31446003 +0000 UTC m=+2009.418256410" Dec 05 16:28:47 crc kubenswrapper[4778]: I1205 16:28:47.424132 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:47 crc kubenswrapper[4778]: I1205 16:28:47.424670 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:47 crc kubenswrapper[4778]: I1205 16:28:47.468739 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:47 crc kubenswrapper[4778]: I1205 16:28:47.797233 4778 scope.go:117] "RemoveContainer" containerID="b94d7768f2620eb17a6e6d0bdf0494f7a48628df8ecd656e000cdd771d23e956" Dec 05 16:28:47 crc kubenswrapper[4778]: I1205 16:28:47.836958 4778 scope.go:117] "RemoveContainer" containerID="eb32e4acd688ec98c6ee444f3c942582c9aba4e2af4117b37984f66d3a2a48ea" Dec 05 16:28:48 crc kubenswrapper[4778]: I1205 16:28:48.385765 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.088398 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tprfd"] Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.090136 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tprfd" podUID="f3e7d018-2ff1-43d3-ba88-2929e3b98be5" containerName="registry-server" containerID="cri-o://9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3" gracePeriod=2
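
The "Observed pod startup duration" record above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A minimal Go sketch that recomputes the record's numbers from its own timestamps; the field relationship is inferred from these values, not taken from kubelet source:

package main

import (
	"fmt"
	"time"
)

// Recomputes the certified-operators-tprfd startup record above. Assumption:
// podStartSLOduration = podStartE2EDuration minus the image-pull window; all
// timestamps below are copied verbatim from the log record.
func main() {
	created := time.Date(2025, 12, 5, 16, 28, 37, 0, time.UTC)               // podCreationTimestamp
	watchObserved := time.Date(2025, 12, 5, 16, 28, 42, 314460030, time.UTC) // watchObservedRunningTime
	firstPull := time.Date(2025, 12, 5, 16, 28, 38, 235018336, time.UTC)     // firstStartedPulling
	lastPull := time.Date(2025, 12, 5, 16, 28, 41, 645517848, time.UTC)      // lastFinishedPulling

	e2e := watchObserved.Sub(created) // 5.31446003s == podStartE2EDuration
	pull := lastPull.Sub(firstPull)   // 3.410499512s spent pulling the image
	fmt.Println(e2e, pull, e2e-pull)  // e2e-pull = 1.903960518s == podStartSLOduration
}

Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.527888 4778 util.go:48] "No ready sandbox for pod can be found. 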
Need to start a new one" pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.715835 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-catalog-content\") pod \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\" (UID: \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\") " Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.715899 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-utilities\") pod \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\" (UID: \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\") " Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.715978 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfqr5\" (UniqueName: \"kubernetes.io/projected/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-kube-api-access-wfqr5\") pod \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\" (UID: \"f3e7d018-2ff1-43d3-ba88-2929e3b98be5\") " Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.717469 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-utilities" (OuterVolumeSpecName: "utilities") pod "f3e7d018-2ff1-43d3-ba88-2929e3b98be5" (UID: "f3e7d018-2ff1-43d3-ba88-2929e3b98be5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.722503 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-kube-api-access-wfqr5" (OuterVolumeSpecName: "kube-api-access-wfqr5") pod "f3e7d018-2ff1-43d3-ba88-2929e3b98be5" (UID: "f3e7d018-2ff1-43d3-ba88-2929e3b98be5"). InnerVolumeSpecName "kube-api-access-wfqr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.727530 4778 generic.go:334] "Generic (PLEG): container finished" podID="f3e7d018-2ff1-43d3-ba88-2929e3b98be5" containerID="9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3" exitCode=0 Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.727578 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tprfd" event={"ID":"f3e7d018-2ff1-43d3-ba88-2929e3b98be5","Type":"ContainerDied","Data":"9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3"} Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.727610 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tprfd" event={"ID":"f3e7d018-2ff1-43d3-ba88-2929e3b98be5","Type":"ContainerDied","Data":"f2ae55d1162afe21486be6634dbbdc812e8a6e63397ef401ba6967149e3b136d"} Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.727631 4778 scope.go:117] "RemoveContainer" containerID="9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.727700 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tprfd" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.767264 4778 scope.go:117] "RemoveContainer" containerID="27db4ef0df8aef410d2e84f83683f04733278511557dcd59d9b12823668acbd2" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.774984 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3e7d018-2ff1-43d3-ba88-2929e3b98be5" (UID: "f3e7d018-2ff1-43d3-ba88-2929e3b98be5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.792996 4778 scope.go:117] "RemoveContainer" containerID="8da5110225a563df1ac74603cdbf1abb39fe2fd4430a8e3a73da41298ded5b03" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.817069 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.817108 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.817120 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfqr5\" (UniqueName: \"kubernetes.io/projected/f3e7d018-2ff1-43d3-ba88-2929e3b98be5-kube-api-access-wfqr5\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.831842 4778 scope.go:117] "RemoveContainer" containerID="9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3" Dec 05 16:28:51 crc kubenswrapper[4778]: E1205 16:28:51.832302 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3\": container with ID starting with 9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3 not found: ID does not exist" containerID="9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.832339 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3"} err="failed to get container status \"9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3\": rpc error: code = NotFound desc = could not find container \"9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3\": container with ID starting with 9797a0e1d24d1fa1cf3f894cae1bb50f5e2dc9544857f20a4c7742a26dfbc8e3 not found: ID does not exist" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.832375 4778 scope.go:117] "RemoveContainer" containerID="27db4ef0df8aef410d2e84f83683f04733278511557dcd59d9b12823668acbd2" Dec 05 16:28:51 crc kubenswrapper[4778]: E1205 16:28:51.832740 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27db4ef0df8aef410d2e84f83683f04733278511557dcd59d9b12823668acbd2\": container with ID starting with 27db4ef0df8aef410d2e84f83683f04733278511557dcd59d9b12823668acbd2 not found: ID does not exist" 
containerID="27db4ef0df8aef410d2e84f83683f04733278511557dcd59d9b12823668acbd2" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.832779 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27db4ef0df8aef410d2e84f83683f04733278511557dcd59d9b12823668acbd2"} err="failed to get container status \"27db4ef0df8aef410d2e84f83683f04733278511557dcd59d9b12823668acbd2\": rpc error: code = NotFound desc = could not find container \"27db4ef0df8aef410d2e84f83683f04733278511557dcd59d9b12823668acbd2\": container with ID starting with 27db4ef0df8aef410d2e84f83683f04733278511557dcd59d9b12823668acbd2 not found: ID does not exist" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.832806 4778 scope.go:117] "RemoveContainer" containerID="8da5110225a563df1ac74603cdbf1abb39fe2fd4430a8e3a73da41298ded5b03" Dec 05 16:28:51 crc kubenswrapper[4778]: E1205 16:28:51.833211 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da5110225a563df1ac74603cdbf1abb39fe2fd4430a8e3a73da41298ded5b03\": container with ID starting with 8da5110225a563df1ac74603cdbf1abb39fe2fd4430a8e3a73da41298ded5b03 not found: ID does not exist" containerID="8da5110225a563df1ac74603cdbf1abb39fe2fd4430a8e3a73da41298ded5b03" Dec 05 16:28:51 crc kubenswrapper[4778]: I1205 16:28:51.833237 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da5110225a563df1ac74603cdbf1abb39fe2fd4430a8e3a73da41298ded5b03"} err="failed to get container status \"8da5110225a563df1ac74603cdbf1abb39fe2fd4430a8e3a73da41298ded5b03\": rpc error: code = NotFound desc = could not find container \"8da5110225a563df1ac74603cdbf1abb39fe2fd4430a8e3a73da41298ded5b03\": container with ID starting with 8da5110225a563df1ac74603cdbf1abb39fe2fd4430a8e3a73da41298ded5b03 not found: ID does not exist" Dec 05 16:28:52 crc kubenswrapper[4778]: I1205 16:28:52.062527 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tprfd"] Dec 05 16:28:52 crc kubenswrapper[4778]: I1205 16:28:52.071512 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tprfd"] Dec 05 16:28:53 crc kubenswrapper[4778]: I1205 16:28:53.261766 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3e7d018-2ff1-43d3-ba88-2929e3b98be5" path="/var/lib/kubelet/pods/f3e7d018-2ff1-43d3-ba88-2929e3b98be5/volumes" Dec 05 16:28:54 crc kubenswrapper[4778]: I1205 16:28:54.249148 4778 scope.go:117] "RemoveContainer" containerID="568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072" Dec 05 16:28:54 crc kubenswrapper[4778]: E1205 16:28:54.249841 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:29:03 crc kubenswrapper[4778]: I1205 16:29:03.414936 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:29:03 crc 
kubenswrapper[4778]: I1205 16:29:03.415789 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:29:03 crc kubenswrapper[4778]: I1205 16:29:03.415872 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 16:29:03 crc kubenswrapper[4778]: I1205 16:29:03.417026 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"07f810bc57816803590a70eb4acea4f15e7b3ff3f3449761e7328bfde876054a"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:29:03 crc kubenswrapper[4778]: I1205 16:29:03.417156 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://07f810bc57816803590a70eb4acea4f15e7b3ff3f3449761e7328bfde876054a" gracePeriod=600 Dec 05 16:29:03 crc kubenswrapper[4778]: I1205 16:29:03.832524 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="07f810bc57816803590a70eb4acea4f15e7b3ff3f3449761e7328bfde876054a" exitCode=0 Dec 05 16:29:03 crc kubenswrapper[4778]: I1205 16:29:03.832601 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"07f810bc57816803590a70eb4acea4f15e7b3ff3f3449761e7328bfde876054a"} Dec 05 16:29:03 crc kubenswrapper[4778]: I1205 16:29:03.832844 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c"} Dec 05 16:29:03 crc kubenswrapper[4778]: I1205 16:29:03.832867 4778 scope.go:117] "RemoveContainer" containerID="ced1b4b93cb56c64e19f60ef165f873b30169851acf91b1ba69e38dad6d7a8f0" Dec 05 16:29:08 crc kubenswrapper[4778]: I1205 16:29:08.249404 4778 scope.go:117] "RemoveContainer" containerID="568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072" Dec 05 16:29:08 crc kubenswrapper[4778]: E1205 16:29:08.250318 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.104680 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vxskq"] Dec 05 16:29:15 crc kubenswrapper[4778]: E1205 16:29:15.105853 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e7d018-2ff1-43d3-ba88-2929e3b98be5" containerName="extract-content" Dec 05 
16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.105878 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e7d018-2ff1-43d3-ba88-2929e3b98be5" containerName="extract-content" Dec 05 16:29:15 crc kubenswrapper[4778]: E1205 16:29:15.105903 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e7d018-2ff1-43d3-ba88-2929e3b98be5" containerName="extract-utilities" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.105913 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e7d018-2ff1-43d3-ba88-2929e3b98be5" containerName="extract-utilities" Dec 05 16:29:15 crc kubenswrapper[4778]: E1205 16:29:15.105927 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e7d018-2ff1-43d3-ba88-2929e3b98be5" containerName="registry-server" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.105935 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e7d018-2ff1-43d3-ba88-2929e3b98be5" containerName="registry-server" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.106175 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3e7d018-2ff1-43d3-ba88-2929e3b98be5" containerName="registry-server" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.108098 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.127272 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vxskq"] Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.201648 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-utilities\") pod \"redhat-operators-vxskq\" (UID: \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\") " pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.201727 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5ckz\" (UniqueName: \"kubernetes.io/projected/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-kube-api-access-m5ckz\") pod \"redhat-operators-vxskq\" (UID: \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\") " pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.201759 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-catalog-content\") pod \"redhat-operators-vxskq\" (UID: \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\") " pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.303043 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-utilities\") pod \"redhat-operators-vxskq\" (UID: \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\") " pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.303111 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5ckz\" (UniqueName: \"kubernetes.io/projected/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-kube-api-access-m5ckz\") pod \"redhat-operators-vxskq\" (UID: \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\") " 
pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.303131 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-catalog-content\") pod \"redhat-operators-vxskq\" (UID: \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\") " pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.303618 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-catalog-content\") pod \"redhat-operators-vxskq\" (UID: \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\") " pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.303882 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-utilities\") pod \"redhat-operators-vxskq\" (UID: \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\") " pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.322915 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5ckz\" (UniqueName: \"kubernetes.io/projected/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-kube-api-access-m5ckz\") pod \"redhat-operators-vxskq\" (UID: \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\") " pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.434332 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.902838 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vxskq"] Dec 05 16:29:15 crc kubenswrapper[4778]: I1205 16:29:15.947682 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxskq" event={"ID":"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d","Type":"ContainerStarted","Data":"a6c2dcadaef6e59ceb263e3bc614178d11090403880374df5908953a0aeb0de8"} Dec 05 16:29:16 crc kubenswrapper[4778]: I1205 16:29:16.956012 4778 generic.go:334] "Generic (PLEG): container finished" podID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" containerID="56a29f80108fdd75177bb40b46c7faf32ef7316fe31e6998ea1b167b77948b7a" exitCode=0 Dec 05 16:29:16 crc kubenswrapper[4778]: I1205 16:29:16.956060 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxskq" event={"ID":"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d","Type":"ContainerDied","Data":"56a29f80108fdd75177bb40b46c7faf32ef7316fe31e6998ea1b167b77948b7a"} Dec 05 16:29:17 crc kubenswrapper[4778]: I1205 16:29:17.965130 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxskq" event={"ID":"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d","Type":"ContainerStarted","Data":"5c6fe028c28f99b325dd39598ffc5f4abad67a780cf27b815dd767a70aa71488"} Dec 05 16:29:20 crc kubenswrapper[4778]: I1205 16:29:20.995347 4778 generic.go:334] "Generic (PLEG): container finished" podID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" containerID="5c6fe028c28f99b325dd39598ffc5f4abad67a780cf27b815dd767a70aa71488" exitCode=0 Dec 05 16:29:20 crc kubenswrapper[4778]: I1205 16:29:20.995394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-vxskq" event={"ID":"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d","Type":"ContainerDied","Data":"5c6fe028c28f99b325dd39598ffc5f4abad67a780cf27b815dd767a70aa71488"} Dec 05 16:29:21 crc kubenswrapper[4778]: I1205 16:29:21.250169 4778 scope.go:117] "RemoveContainer" containerID="568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072" Dec 05 16:29:23 crc kubenswrapper[4778]: I1205 16:29:23.015749 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxskq" event={"ID":"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d","Type":"ContainerStarted","Data":"11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5"} Dec 05 16:29:23 crc kubenswrapper[4778]: I1205 16:29:23.018897 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerStarted","Data":"26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a"} Dec 05 16:29:23 crc kubenswrapper[4778]: I1205 16:29:23.041011 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vxskq" podStartSLOduration=2.811463491 podStartE2EDuration="8.040989849s" podCreationTimestamp="2025-12-05 16:29:15 +0000 UTC" firstStartedPulling="2025-12-05 16:29:16.957491615 +0000 UTC m=+2044.061287995" lastFinishedPulling="2025-12-05 16:29:22.187017973 +0000 UTC m=+2049.290814353" observedRunningTime="2025-12-05 16:29:23.032794369 +0000 UTC m=+2050.136590759" watchObservedRunningTime="2025-12-05 16:29:23.040989849 +0000 UTC m=+2050.144786239" Dec 05 16:29:25 crc kubenswrapper[4778]: I1205 16:29:25.435091 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:25 crc kubenswrapper[4778]: I1205 16:29:25.435592 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:25 crc kubenswrapper[4778]: I1205 16:29:25.522735 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:29:25 crc kubenswrapper[4778]: E1205 16:29:25.523422 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a is running failed: container process not found" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 16:29:25 crc kubenswrapper[4778]: E1205 16:29:25.524490 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a is running failed: container process not found" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 16:29:25 crc kubenswrapper[4778]: E1205 16:29:25.524829 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a is running failed: container process not found" 
containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 16:29:25 crc kubenswrapper[4778]: E1205 16:29:25.524855 4778 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a is running failed: container process not found" probeType="Startup" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:29:26 crc kubenswrapper[4778]: I1205 16:29:26.044969 4778 generic.go:334] "Generic (PLEG): container finished" podID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" exitCode=1 Dec 05 16:29:26 crc kubenswrapper[4778]: I1205 16:29:26.045020 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerDied","Data":"26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a"} Dec 05 16:29:26 crc kubenswrapper[4778]: I1205 16:29:26.045075 4778 scope.go:117] "RemoveContainer" containerID="568fd01abd30ca7db574370e8859779046d6a400ccdbd56d8dd4d77f9c033072" Dec 05 16:29:26 crc kubenswrapper[4778]: I1205 16:29:26.045668 4778 scope.go:117] "RemoveContainer" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" Dec 05 16:29:26 crc kubenswrapper[4778]: E1205 16:29:26.045875 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:29:26 crc kubenswrapper[4778]: I1205 16:29:26.484386 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vxskq" podUID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" containerName="registry-server" probeResult="failure" output=< Dec 05 16:29:26 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Dec 05 16:29:26 crc kubenswrapper[4778]: > Dec 05 16:29:35 crc kubenswrapper[4778]: I1205 16:29:35.482278 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:35 crc kubenswrapper[4778]: I1205 16:29:35.522626 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:29:35 crc kubenswrapper[4778]: I1205 16:29:35.523005 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:29:35 crc kubenswrapper[4778]: I1205 16:29:35.523024 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:29:35 crc kubenswrapper[4778]: I1205 16:29:35.523656 4778 scope.go:117] "RemoveContainer" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" Dec 05 16:29:35 crc kubenswrapper[4778]: E1205 16:29:35.523907 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:29:35 crc kubenswrapper[4778]: I1205 16:29:35.532985 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:36 crc kubenswrapper[4778]: I1205 16:29:36.127161 4778 scope.go:117] "RemoveContainer" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" Dec 05 16:29:36 crc kubenswrapper[4778]: E1205 16:29:36.127381 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:29:39 crc kubenswrapper[4778]: I1205 16:29:39.093619 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vxskq"] Dec 05 16:29:39 crc kubenswrapper[4778]: I1205 16:29:39.094226 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vxskq" podUID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" containerName="registry-server" containerID="cri-o://11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5" gracePeriod=2 Dec 05 16:29:39 crc kubenswrapper[4778]: E1205 16:29:39.186201 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dc5d977_f5b6_4297_a0ce_1bb144d6c10d.slice/crio-11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dc5d977_f5b6_4297_a0ce_1bb144d6c10d.slice/crio-conmon-11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5.scope\": RecentStats: unable to find data in memory cache]" Dec 05 16:29:39 crc kubenswrapper[4778]: I1205 16:29:39.536157 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:39 crc kubenswrapper[4778]: I1205 16:29:39.717744 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-utilities\") pod \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\" (UID: \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\") " Dec 05 16:29:39 crc kubenswrapper[4778]: I1205 16:29:39.717886 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-catalog-content\") pod \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\" (UID: \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\") " Dec 05 16:29:39 crc kubenswrapper[4778]: I1205 16:29:39.718031 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5ckz\" (UniqueName: \"kubernetes.io/projected/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-kube-api-access-m5ckz\") pod \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\" (UID: \"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d\") " Dec 05 16:29:39 crc kubenswrapper[4778]: I1205 16:29:39.718968 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-utilities" (OuterVolumeSpecName: "utilities") pod "0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" (UID: "0dc5d977-f5b6-4297-a0ce-1bb144d6c10d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:29:39 crc kubenswrapper[4778]: I1205 16:29:39.725374 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-kube-api-access-m5ckz" (OuterVolumeSpecName: "kube-api-access-m5ckz") pod "0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" (UID: "0dc5d977-f5b6-4297-a0ce-1bb144d6c10d"). InnerVolumeSpecName "kube-api-access-m5ckz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:29:39 crc kubenswrapper[4778]: I1205 16:29:39.820113 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:39 crc kubenswrapper[4778]: I1205 16:29:39.820145 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5ckz\" (UniqueName: \"kubernetes.io/projected/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-kube-api-access-m5ckz\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:39 crc kubenswrapper[4778]: I1205 16:29:39.842933 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" (UID: "0dc5d977-f5b6-4297-a0ce-1bb144d6c10d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:29:39 crc kubenswrapper[4778]: I1205 16:29:39.921671 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.163745 4778 generic.go:334] "Generic (PLEG): container finished" podID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" containerID="11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5" exitCode=0 Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.163785 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxskq" event={"ID":"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d","Type":"ContainerDied","Data":"11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5"} Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.163812 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxskq" event={"ID":"0dc5d977-f5b6-4297-a0ce-1bb144d6c10d","Type":"ContainerDied","Data":"a6c2dcadaef6e59ceb263e3bc614178d11090403880374df5908953a0aeb0de8"} Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.163845 4778 scope.go:117] "RemoveContainer" containerID="11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5" Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.163845 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vxskq" Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.206423 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vxskq"] Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.209059 4778 scope.go:117] "RemoveContainer" containerID="5c6fe028c28f99b325dd39598ffc5f4abad67a780cf27b815dd767a70aa71488" Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.213079 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vxskq"] Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.240633 4778 scope.go:117] "RemoveContainer" containerID="56a29f80108fdd75177bb40b46c7faf32ef7316fe31e6998ea1b167b77948b7a" Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.290573 4778 scope.go:117] "RemoveContainer" containerID="11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5" Dec 05 16:29:40 crc kubenswrapper[4778]: E1205 16:29:40.297065 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5\": container with ID starting with 11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5 not found: ID does not exist" containerID="11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5" Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.297144 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5"} err="failed to get container status \"11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5\": rpc error: code = NotFound desc = could not find container \"11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5\": container with ID starting with 11f69e31c9918b0129fb0815ed130e82687d4030e79a4aa98ef1b72c76ad8fb5 not found: ID does not exist" Dec 05 16:29:40 crc 
kubenswrapper[4778]: I1205 16:29:40.297187 4778 scope.go:117] "RemoveContainer" containerID="5c6fe028c28f99b325dd39598ffc5f4abad67a780cf27b815dd767a70aa71488" Dec 05 16:29:40 crc kubenswrapper[4778]: E1205 16:29:40.297622 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c6fe028c28f99b325dd39598ffc5f4abad67a780cf27b815dd767a70aa71488\": container with ID starting with 5c6fe028c28f99b325dd39598ffc5f4abad67a780cf27b815dd767a70aa71488 not found: ID does not exist" containerID="5c6fe028c28f99b325dd39598ffc5f4abad67a780cf27b815dd767a70aa71488" Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.297833 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6fe028c28f99b325dd39598ffc5f4abad67a780cf27b815dd767a70aa71488"} err="failed to get container status \"5c6fe028c28f99b325dd39598ffc5f4abad67a780cf27b815dd767a70aa71488\": rpc error: code = NotFound desc = could not find container \"5c6fe028c28f99b325dd39598ffc5f4abad67a780cf27b815dd767a70aa71488\": container with ID starting with 5c6fe028c28f99b325dd39598ffc5f4abad67a780cf27b815dd767a70aa71488 not found: ID does not exist" Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.297962 4778 scope.go:117] "RemoveContainer" containerID="56a29f80108fdd75177bb40b46c7faf32ef7316fe31e6998ea1b167b77948b7a" Dec 05 16:29:40 crc kubenswrapper[4778]: E1205 16:29:40.298488 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a29f80108fdd75177bb40b46c7faf32ef7316fe31e6998ea1b167b77948b7a\": container with ID starting with 56a29f80108fdd75177bb40b46c7faf32ef7316fe31e6998ea1b167b77948b7a not found: ID does not exist" containerID="56a29f80108fdd75177bb40b46c7faf32ef7316fe31e6998ea1b167b77948b7a" Dec 05 16:29:40 crc kubenswrapper[4778]: I1205 16:29:40.298527 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a29f80108fdd75177bb40b46c7faf32ef7316fe31e6998ea1b167b77948b7a"} err="failed to get container status \"56a29f80108fdd75177bb40b46c7faf32ef7316fe31e6998ea1b167b77948b7a\": rpc error: code = NotFound desc = could not find container \"56a29f80108fdd75177bb40b46c7faf32ef7316fe31e6998ea1b167b77948b7a\": container with ID starting with 56a29f80108fdd75177bb40b46c7faf32ef7316fe31e6998ea1b167b77948b7a not found: ID does not exist" Dec 05 16:29:41 crc kubenswrapper[4778]: I1205 16:29:41.264429 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" path="/var/lib/kubelet/pods/0dc5d977-f5b6-4297-a0ce-1bb144d6c10d/volumes" Dec 05 16:29:50 crc kubenswrapper[4778]: I1205 16:29:50.250312 4778 scope.go:117] "RemoveContainer" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" Dec 05 16:29:50 crc kubenswrapper[4778]: E1205 16:29:50.251480 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.161074 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp"] Dec 05 16:30:00 crc kubenswrapper[4778]: E1205 16:30:00.162287 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" containerName="registry-server" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.162307 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" containerName="registry-server" Dec 05 16:30:00 crc kubenswrapper[4778]: E1205 16:30:00.162347 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" containerName="extract-content" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.162356 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" containerName="extract-content" Dec 05 16:30:00 crc kubenswrapper[4778]: E1205 16:30:00.162390 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" containerName="extract-utilities" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.162403 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" containerName="extract-utilities" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.162636 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc5d977-f5b6-4297-a0ce-1bb144d6c10d" containerName="registry-server" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.163343 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.166199 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.166215 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.172286 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp"] Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.189853 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-secret-volume\") pod \"collect-profiles-29415870-2pbkp\" (UID: \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.189935 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-config-volume\") pod \"collect-profiles-29415870-2pbkp\" (UID: \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.190013 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfnzv\" (UniqueName: \"kubernetes.io/projected/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-kube-api-access-kfnzv\") pod \"collect-profiles-29415870-2pbkp\" (UID: \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.291942 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-secret-volume\") pod \"collect-profiles-29415870-2pbkp\" (UID: \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.292031 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-config-volume\") pod \"collect-profiles-29415870-2pbkp\" (UID: \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.292055 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfnzv\" (UniqueName: \"kubernetes.io/projected/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-kube-api-access-kfnzv\") pod \"collect-profiles-29415870-2pbkp\" (UID: \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.293177 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-config-volume\") pod \"collect-profiles-29415870-2pbkp\" (UID: \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.297561 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-secret-volume\") pod \"collect-profiles-29415870-2pbkp\" (UID: \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.311030 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfnzv\" (UniqueName: \"kubernetes.io/projected/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-kube-api-access-kfnzv\") pod \"collect-profiles-29415870-2pbkp\" (UID: \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.482530 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:00 crc kubenswrapper[4778]: I1205 16:30:00.913256 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp"] Dec 05 16:30:00 crc kubenswrapper[4778]: W1205 16:30:00.918133 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f47caf0_b8c4_49eb_82f5_c76b33c6828f.slice/crio-addf1e17d4b07ba9998331f8ad6529f4cf89be31b7681d2fe6f4b6853a439633 WatchSource:0}: Error finding container addf1e17d4b07ba9998331f8ad6529f4cf89be31b7681d2fe6f4b6853a439633: Status 404 returned error can't find the container with id addf1e17d4b07ba9998331f8ad6529f4cf89be31b7681d2fe6f4b6853a439633 Dec 05 16:30:01 crc kubenswrapper[4778]: E1205 16:30:01.077737 4778 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.130:51114->38.102.83.130:42485: write tcp 38.102.83.130:51114->38.102.83.130:42485: write: broken pipe Dec 05 16:30:01 crc kubenswrapper[4778]: I1205 16:30:01.375822 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" event={"ID":"8f47caf0-b8c4-49eb-82f5-c76b33c6828f","Type":"ContainerStarted","Data":"94fbc07620a2a279e6f1f50d317e8639fbb967f83e104e7ca5dc451db791f140"} Dec 05 16:30:01 crc kubenswrapper[4778]: I1205 16:30:01.375864 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" event={"ID":"8f47caf0-b8c4-49eb-82f5-c76b33c6828f","Type":"ContainerStarted","Data":"addf1e17d4b07ba9998331f8ad6529f4cf89be31b7681d2fe6f4b6853a439633"} Dec 05 16:30:01 crc kubenswrapper[4778]: I1205 16:30:01.404960 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" podStartSLOduration=1.404938875 podStartE2EDuration="1.404938875s" podCreationTimestamp="2025-12-05 16:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:30:01.396652322 +0000 UTC m=+2088.500448702" watchObservedRunningTime="2025-12-05 16:30:01.404938875 +0000 UTC m=+2088.508735255" Dec 05 16:30:02 crc kubenswrapper[4778]: I1205 16:30:02.385915 4778 generic.go:334] "Generic (PLEG): container finished" podID="8f47caf0-b8c4-49eb-82f5-c76b33c6828f" containerID="94fbc07620a2a279e6f1f50d317e8639fbb967f83e104e7ca5dc451db791f140" exitCode=0 Dec 05 16:30:02 crc kubenswrapper[4778]: I1205 16:30:02.385986 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" event={"ID":"8f47caf0-b8c4-49eb-82f5-c76b33c6828f","Type":"ContainerDied","Data":"94fbc07620a2a279e6f1f50d317e8639fbb967f83e104e7ca5dc451db791f140"} Dec 05 16:30:03 crc kubenswrapper[4778]: I1205 16:30:03.754502 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:03 crc kubenswrapper[4778]: I1205 16:30:03.891324 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-secret-volume\") pod \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\" (UID: \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\") " Dec 05 16:30:03 crc kubenswrapper[4778]: I1205 16:30:03.891493 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-config-volume\") pod \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\" (UID: \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\") " Dec 05 16:30:03 crc kubenswrapper[4778]: I1205 16:30:03.891637 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfnzv\" (UniqueName: \"kubernetes.io/projected/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-kube-api-access-kfnzv\") pod \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\" (UID: \"8f47caf0-b8c4-49eb-82f5-c76b33c6828f\") " Dec 05 16:30:03 crc kubenswrapper[4778]: I1205 16:30:03.892241 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-config-volume" (OuterVolumeSpecName: "config-volume") pod "8f47caf0-b8c4-49eb-82f5-c76b33c6828f" (UID: "8f47caf0-b8c4-49eb-82f5-c76b33c6828f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:30:03 crc kubenswrapper[4778]: I1205 16:30:03.897504 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8f47caf0-b8c4-49eb-82f5-c76b33c6828f" (UID: "8f47caf0-b8c4-49eb-82f5-c76b33c6828f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:30:03 crc kubenswrapper[4778]: I1205 16:30:03.900148 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-kube-api-access-kfnzv" (OuterVolumeSpecName: "kube-api-access-kfnzv") pod "8f47caf0-b8c4-49eb-82f5-c76b33c6828f" (UID: "8f47caf0-b8c4-49eb-82f5-c76b33c6828f"). InnerVolumeSpecName "kube-api-access-kfnzv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:30:03 crc kubenswrapper[4778]: I1205 16:30:03.993483 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfnzv\" (UniqueName: \"kubernetes.io/projected/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-kube-api-access-kfnzv\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:03 crc kubenswrapper[4778]: I1205 16:30:03.993516 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:03 crc kubenswrapper[4778]: I1205 16:30:03.993527 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f47caf0-b8c4-49eb-82f5-c76b33c6828f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:04 crc kubenswrapper[4778]: I1205 16:30:04.250357 4778 scope.go:117] "RemoveContainer" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" Dec 05 16:30:04 crc kubenswrapper[4778]: E1205 16:30:04.250926 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:30:04 crc kubenswrapper[4778]: I1205 16:30:04.408658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" event={"ID":"8f47caf0-b8c4-49eb-82f5-c76b33c6828f","Type":"ContainerDied","Data":"addf1e17d4b07ba9998331f8ad6529f4cf89be31b7681d2fe6f4b6853a439633"} Dec 05 16:30:04 crc kubenswrapper[4778]: I1205 16:30:04.408703 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="addf1e17d4b07ba9998331f8ad6529f4cf89be31b7681d2fe6f4b6853a439633" Dec 05 16:30:04 crc kubenswrapper[4778]: I1205 16:30:04.408708 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-2pbkp" Dec 05 16:30:04 crc kubenswrapper[4778]: I1205 16:30:04.839786 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd"] Dec 05 16:30:04 crc kubenswrapper[4778]: I1205 16:30:04.847033 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415825-9mpnd"] Dec 05 16:30:05 crc kubenswrapper[4778]: I1205 16:30:05.259701 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00bbab66-fa02-4505-8afb-d9d9c1370d95" path="/var/lib/kubelet/pods/00bbab66-fa02-4505-8afb-d9d9c1370d95/volumes" Dec 05 16:30:17 crc kubenswrapper[4778]: I1205 16:30:17.248928 4778 scope.go:117] "RemoveContainer" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" Dec 05 16:30:17 crc kubenswrapper[4778]: E1205 16:30:17.249646 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:30:28 crc kubenswrapper[4778]: I1205 16:30:28.250078 4778 scope.go:117] "RemoveContainer" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" Dec 05 16:30:28 crc kubenswrapper[4778]: E1205 16:30:28.251064 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:30:39 crc kubenswrapper[4778]: I1205 16:30:39.249856 4778 scope.go:117] "RemoveContainer" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" Dec 05 16:30:39 crc kubenswrapper[4778]: E1205 16:30:39.250591 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:30:47 crc kubenswrapper[4778]: I1205 16:30:47.995049 4778 scope.go:117] "RemoveContainer" containerID="5ed5f18a843b48f50604e98b811e271036ea4e3626b486832ce9e95c37f718de" Dec 05 16:30:50 crc kubenswrapper[4778]: I1205 16:30:50.249872 4778 scope.go:117] "RemoveContainer" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" Dec 05 16:30:50 crc kubenswrapper[4778]: E1205 16:30:50.250509 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(b3d171fd-e9d8-4778-917f-ccfad7c27404)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.040964 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm"] Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.050956 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-wbpzm"] Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.093079 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watchere724-account-delete-54jj6"] Dec 05 16:30:54 crc kubenswrapper[4778]: E1205 16:30:54.093464 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f47caf0-b8c4-49eb-82f5-c76b33c6828f" containerName="collect-profiles" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.093480 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f47caf0-b8c4-49eb-82f5-c76b33c6828f" containerName="collect-profiles" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.093633 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f47caf0-b8c4-49eb-82f5-c76b33c6828f" containerName="collect-profiles" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.094190 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchere724-account-delete-54jj6" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.113452 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchere724-account-delete-54jj6"] Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.148849 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh498\" (UniqueName: \"kubernetes.io/projected/423fd545-0438-4c3e-be37-b48929655ae8-kube-api-access-zh498\") pod \"watchere724-account-delete-54jj6\" (UID: \"423fd545-0438-4c3e-be37-b48929655ae8\") " pod="watcher-kuttl-default/watchere724-account-delete-54jj6" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.148989 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/423fd545-0438-4c3e-be37-b48929655ae8-operator-scripts\") pod \"watchere724-account-delete-54jj6\" (UID: \"423fd545-0438-4c3e-be37-b48929655ae8\") " pod="watcher-kuttl-default/watchere724-account-delete-54jj6" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.159042 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.159290 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="828b1128-474f-4ce3-a67d-b0f9ac493824" containerName="watcher-applier" containerID="cri-o://77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a" gracePeriod=30 Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.180019 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.210799 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.211047 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="0e8805fe-ca05-4473-a495-51825049a597" 
containerName="watcher-kuttl-api-log" containerID="cri-o://92f1c32d521a3ad41b7358af7fd8f9efca8bd5c33ea07ba39afbd9efdcc0d083" gracePeriod=30 Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.211181 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="0e8805fe-ca05-4473-a495-51825049a597" containerName="watcher-api" containerID="cri-o://4d7ffc071fc8e981492fa34ef8460623f6088a38943ae8b307b43d627d7ab2ba" gracePeriod=30 Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.250296 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/423fd545-0438-4c3e-be37-b48929655ae8-operator-scripts\") pod \"watchere724-account-delete-54jj6\" (UID: \"423fd545-0438-4c3e-be37-b48929655ae8\") " pod="watcher-kuttl-default/watchere724-account-delete-54jj6" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.250421 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh498\" (UniqueName: \"kubernetes.io/projected/423fd545-0438-4c3e-be37-b48929655ae8-kube-api-access-zh498\") pod \"watchere724-account-delete-54jj6\" (UID: \"423fd545-0438-4c3e-be37-b48929655ae8\") " pod="watcher-kuttl-default/watchere724-account-delete-54jj6" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.251575 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/423fd545-0438-4c3e-be37-b48929655ae8-operator-scripts\") pod \"watchere724-account-delete-54jj6\" (UID: \"423fd545-0438-4c3e-be37-b48929655ae8\") " pod="watcher-kuttl-default/watchere724-account-delete-54jj6" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.291943 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh498\" (UniqueName: \"kubernetes.io/projected/423fd545-0438-4c3e-be37-b48929655ae8-kube-api-access-zh498\") pod \"watchere724-account-delete-54jj6\" (UID: \"423fd545-0438-4c3e-be37-b48929655ae8\") " pod="watcher-kuttl-default/watchere724-account-delete-54jj6" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.413703 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchere724-account-delete-54jj6" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.601994 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.666747 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3d171fd-e9d8-4778-917f-ccfad7c27404-logs\") pod \"b3d171fd-e9d8-4778-917f-ccfad7c27404\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.666836 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22twg\" (UniqueName: \"kubernetes.io/projected/b3d171fd-e9d8-4778-917f-ccfad7c27404-kube-api-access-22twg\") pod \"b3d171fd-e9d8-4778-917f-ccfad7c27404\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.666856 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-custom-prometheus-ca\") pod \"b3d171fd-e9d8-4778-917f-ccfad7c27404\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.666880 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-config-data\") pod \"b3d171fd-e9d8-4778-917f-ccfad7c27404\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.667058 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-combined-ca-bundle\") pod \"b3d171fd-e9d8-4778-917f-ccfad7c27404\" (UID: \"b3d171fd-e9d8-4778-917f-ccfad7c27404\") " Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.672645 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d171fd-e9d8-4778-917f-ccfad7c27404-logs" (OuterVolumeSpecName: "logs") pod "b3d171fd-e9d8-4778-917f-ccfad7c27404" (UID: "b3d171fd-e9d8-4778-917f-ccfad7c27404"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.685573 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d171fd-e9d8-4778-917f-ccfad7c27404-kube-api-access-22twg" (OuterVolumeSpecName: "kube-api-access-22twg") pod "b3d171fd-e9d8-4778-917f-ccfad7c27404" (UID: "b3d171fd-e9d8-4778-917f-ccfad7c27404"). InnerVolumeSpecName "kube-api-access-22twg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.706036 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b3d171fd-e9d8-4778-917f-ccfad7c27404" (UID: "b3d171fd-e9d8-4778-917f-ccfad7c27404"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.728529 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3d171fd-e9d8-4778-917f-ccfad7c27404" (UID: "b3d171fd-e9d8-4778-917f-ccfad7c27404"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.739031 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-config-data" (OuterVolumeSpecName: "config-data") pod "b3d171fd-e9d8-4778-917f-ccfad7c27404" (UID: "b3d171fd-e9d8-4778-917f-ccfad7c27404"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.769834 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.769875 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3d171fd-e9d8-4778-917f-ccfad7c27404-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.769889 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22twg\" (UniqueName: \"kubernetes.io/projected/b3d171fd-e9d8-4778-917f-ccfad7c27404-kube-api-access-22twg\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.769903 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.769913 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d171fd-e9d8-4778-917f-ccfad7c27404-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.857597 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.857589 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b3d171fd-e9d8-4778-917f-ccfad7c27404","Type":"ContainerDied","Data":"c3b10391d04d3a880f10a8d6cd0da35a9bdb486f85e5e3dcd58da4536702fb14"} Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.858791 4778 scope.go:117] "RemoveContainer" containerID="26db4b1f6b322eaa11394ba3d3f219ae515b7276fc25db8e393386b6ee94dc7a" Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.864057 4778 generic.go:334] "Generic (PLEG): container finished" podID="0e8805fe-ca05-4473-a495-51825049a597" containerID="92f1c32d521a3ad41b7358af7fd8f9efca8bd5c33ea07ba39afbd9efdcc0d083" exitCode=143 Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.864108 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"0e8805fe-ca05-4473-a495-51825049a597","Type":"ContainerDied","Data":"92f1c32d521a3ad41b7358af7fd8f9efca8bd5c33ea07ba39afbd9efdcc0d083"} Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.914127 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:30:54 crc kubenswrapper[4778]: I1205 16:30:54.937879 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:30:55 crc kubenswrapper[4778]: I1205 16:30:55.028608 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchere724-account-delete-54jj6"] Dec 05 16:30:55 crc kubenswrapper[4778]: I1205 16:30:55.259308 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48692f6b-de46-4fc0-88c1-85eb3a003a63" path="/var/lib/kubelet/pods/48692f6b-de46-4fc0-88c1-85eb3a003a63/volumes" Dec 05 16:30:55 crc kubenswrapper[4778]: I1205 16:30:55.260298 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" path="/var/lib/kubelet/pods/b3d171fd-e9d8-4778-917f-ccfad7c27404/volumes" Dec 05 16:30:55 crc kubenswrapper[4778]: I1205 16:30:55.511232 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="0e8805fe-ca05-4473-a495-51825049a597" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"https://10.217.0.148:9322/\": dial tcp 10.217.0.148:9322: connect: connection refused" Dec 05 16:30:55 crc kubenswrapper[4778]: I1205 16:30:55.511646 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="0e8805fe-ca05-4473-a495-51825049a597" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.148:9322/\": dial tcp 10.217.0.148:9322: connect: connection refused" Dec 05 16:30:55 crc kubenswrapper[4778]: E1205 16:30:55.591575 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:30:55 crc kubenswrapper[4778]: E1205 16:30:55.594693 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:30:55 crc kubenswrapper[4778]: E1205 16:30:55.599896 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:30:55 crc kubenswrapper[4778]: E1205 16:30:55.599994 4778 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="828b1128-474f-4ce3-a67d-b0f9ac493824" containerName="watcher-applier" Dec 05 16:30:55 crc kubenswrapper[4778]: I1205 16:30:55.874714 4778 generic.go:334] "Generic (PLEG): container finished" podID="0e8805fe-ca05-4473-a495-51825049a597" containerID="4d7ffc071fc8e981492fa34ef8460623f6088a38943ae8b307b43d627d7ab2ba" exitCode=0 Dec 05 16:30:55 crc kubenswrapper[4778]: I1205 16:30:55.874773 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"0e8805fe-ca05-4473-a495-51825049a597","Type":"ContainerDied","Data":"4d7ffc071fc8e981492fa34ef8460623f6088a38943ae8b307b43d627d7ab2ba"} Dec 05 16:30:55 crc kubenswrapper[4778]: I1205 16:30:55.876983 4778 generic.go:334] "Generic (PLEG): container finished" podID="423fd545-0438-4c3e-be37-b48929655ae8" containerID="d85bc21af86055dc7c9fb771fd32cfac038318826a7ac50e0a6e7a35d85466c8" exitCode=0 Dec 05 16:30:55 crc kubenswrapper[4778]: I1205 16:30:55.877033 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchere724-account-delete-54jj6" event={"ID":"423fd545-0438-4c3e-be37-b48929655ae8","Type":"ContainerDied","Data":"d85bc21af86055dc7c9fb771fd32cfac038318826a7ac50e0a6e7a35d85466c8"} Dec 05 16:30:55 crc kubenswrapper[4778]: I1205 16:30:55.877074 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchere724-account-delete-54jj6" event={"ID":"423fd545-0438-4c3e-be37-b48929655ae8","Type":"ContainerStarted","Data":"5a72522d661a8fce8d316b6a642abd8169b7f9cc8d28ac60023116f98d2a306f"} Dec 05 16:30:55 crc kubenswrapper[4778]: I1205 16:30:55.951064 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.018800 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cns2f\" (UniqueName: \"kubernetes.io/projected/0e8805fe-ca05-4473-a495-51825049a597-kube-api-access-cns2f\") pod \"0e8805fe-ca05-4473-a495-51825049a597\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.018880 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-custom-prometheus-ca\") pod \"0e8805fe-ca05-4473-a495-51825049a597\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.018916 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-combined-ca-bundle\") pod \"0e8805fe-ca05-4473-a495-51825049a597\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.018956 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-internal-tls-certs\") pod \"0e8805fe-ca05-4473-a495-51825049a597\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.019019 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-public-tls-certs\") pod \"0e8805fe-ca05-4473-a495-51825049a597\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.019093 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e8805fe-ca05-4473-a495-51825049a597-logs\") pod \"0e8805fe-ca05-4473-a495-51825049a597\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.019157 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-config-data\") pod \"0e8805fe-ca05-4473-a495-51825049a597\" (UID: \"0e8805fe-ca05-4473-a495-51825049a597\") " Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.054687 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e8805fe-ca05-4473-a495-51825049a597-logs" (OuterVolumeSpecName: "logs") pod "0e8805fe-ca05-4473-a495-51825049a597" (UID: "0e8805fe-ca05-4473-a495-51825049a597"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.055680 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e8805fe-ca05-4473-a495-51825049a597-kube-api-access-cns2f" (OuterVolumeSpecName: "kube-api-access-cns2f") pod "0e8805fe-ca05-4473-a495-51825049a597" (UID: "0e8805fe-ca05-4473-a495-51825049a597"). InnerVolumeSpecName "kube-api-access-cns2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.086609 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e8805fe-ca05-4473-a495-51825049a597" (UID: "0e8805fe-ca05-4473-a495-51825049a597"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.123691 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e8805fe-ca05-4473-a495-51825049a597-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.123727 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cns2f\" (UniqueName: \"kubernetes.io/projected/0e8805fe-ca05-4473-a495-51825049a597-kube-api-access-cns2f\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.123738 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.164504 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "0e8805fe-ca05-4473-a495-51825049a597" (UID: "0e8805fe-ca05-4473-a495-51825049a597"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.205561 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-config-data" (OuterVolumeSpecName: "config-data") pod "0e8805fe-ca05-4473-a495-51825049a597" (UID: "0e8805fe-ca05-4473-a495-51825049a597"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.206044 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0e8805fe-ca05-4473-a495-51825049a597" (UID: "0e8805fe-ca05-4473-a495-51825049a597"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.208537 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0e8805fe-ca05-4473-a495-51825049a597" (UID: "0e8805fe-ca05-4473-a495-51825049a597"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.225040 4778 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.225295 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.225398 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.225463 4778 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e8805fe-ca05-4473-a495-51825049a597-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.888065 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"0e8805fe-ca05-4473-a495-51825049a597","Type":"ContainerDied","Data":"13a7323d690b0903a332b5214cbbaec26482f6f4e79a1870a0cd5fb31e5a7bdc"} Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.888102 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.888148 4778 scope.go:117] "RemoveContainer" containerID="4d7ffc071fc8e981492fa34ef8460623f6088a38943ae8b307b43d627d7ab2ba" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.917283 4778 scope.go:117] "RemoveContainer" containerID="92f1c32d521a3ad41b7358af7fd8f9efca8bd5c33ea07ba39afbd9efdcc0d083" Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.918238 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:30:56 crc kubenswrapper[4778]: I1205 16:30:56.925132 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:30:57 crc kubenswrapper[4778]: I1205 16:30:57.261484 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e8805fe-ca05-4473-a495-51825049a597" path="/var/lib/kubelet/pods/0e8805fe-ca05-4473-a495-51825049a597/volumes" Dec 05 16:30:57 crc kubenswrapper[4778]: I1205 16:30:57.318964 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchere724-account-delete-54jj6" Dec 05 16:30:57 crc kubenswrapper[4778]: I1205 16:30:57.341987 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/423fd545-0438-4c3e-be37-b48929655ae8-operator-scripts\") pod \"423fd545-0438-4c3e-be37-b48929655ae8\" (UID: \"423fd545-0438-4c3e-be37-b48929655ae8\") " Dec 05 16:30:57 crc kubenswrapper[4778]: I1205 16:30:57.342148 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh498\" (UniqueName: \"kubernetes.io/projected/423fd545-0438-4c3e-be37-b48929655ae8-kube-api-access-zh498\") pod \"423fd545-0438-4c3e-be37-b48929655ae8\" (UID: \"423fd545-0438-4c3e-be37-b48929655ae8\") " Dec 05 16:30:57 crc kubenswrapper[4778]: I1205 16:30:57.342560 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/423fd545-0438-4c3e-be37-b48929655ae8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "423fd545-0438-4c3e-be37-b48929655ae8" (UID: "423fd545-0438-4c3e-be37-b48929655ae8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:30:57 crc kubenswrapper[4778]: I1205 16:30:57.342931 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/423fd545-0438-4c3e-be37-b48929655ae8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:57 crc kubenswrapper[4778]: I1205 16:30:57.349542 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423fd545-0438-4c3e-be37-b48929655ae8-kube-api-access-zh498" (OuterVolumeSpecName: "kube-api-access-zh498") pod "423fd545-0438-4c3e-be37-b48929655ae8" (UID: "423fd545-0438-4c3e-be37-b48929655ae8"). InnerVolumeSpecName "kube-api-access-zh498". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:30:57 crc kubenswrapper[4778]: I1205 16:30:57.443989 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh498\" (UniqueName: \"kubernetes.io/projected/423fd545-0438-4c3e-be37-b48929655ae8-kube-api-access-zh498\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:57 crc kubenswrapper[4778]: I1205 16:30:57.905727 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchere724-account-delete-54jj6" event={"ID":"423fd545-0438-4c3e-be37-b48929655ae8","Type":"ContainerDied","Data":"5a72522d661a8fce8d316b6a642abd8169b7f9cc8d28ac60023116f98d2a306f"} Dec 05 16:30:57 crc kubenswrapper[4778]: I1205 16:30:57.905996 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a72522d661a8fce8d316b6a642abd8169b7f9cc8d28ac60023116f98d2a306f" Dec 05 16:30:57 crc kubenswrapper[4778]: I1205 16:30:57.905795 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchere724-account-delete-54jj6" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.121730 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-ghjd7"] Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.130586 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-ghjd7"] Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.151424 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watchere724-account-delete-54jj6"] Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.159501 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-e724-account-create-update-gtcbp"] Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.170523 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watchere724-account-delete-54jj6"] Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.177608 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-e724-account-create-update-gtcbp"] Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.226480 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-vvs4k"] Dec 05 16:30:59 crc kubenswrapper[4778]: E1205 16:30:59.226844 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.226860 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: E1205 16:30:59.226877 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.226883 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: E1205 16:30:59.226894 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.226899 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: E1205 16:30:59.226912 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423fd545-0438-4c3e-be37-b48929655ae8" containerName="mariadb-account-delete" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.226921 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="423fd545-0438-4c3e-be37-b48929655ae8" containerName="mariadb-account-delete" Dec 05 16:30:59 crc kubenswrapper[4778]: E1205 16:30:59.226930 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8805fe-ca05-4473-a495-51825049a597" containerName="watcher-api" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.226936 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8805fe-ca05-4473-a495-51825049a597" containerName="watcher-api" Dec 05 16:30:59 crc kubenswrapper[4778]: E1205 16:30:59.226942 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8805fe-ca05-4473-a495-51825049a597" 
containerName="watcher-kuttl-api-log" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.226949 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8805fe-ca05-4473-a495-51825049a597" containerName="watcher-kuttl-api-log" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.227106 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.227117 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.227132 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e8805fe-ca05-4473-a495-51825049a597" containerName="watcher-kuttl-api-log" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.227143 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e8805fe-ca05-4473-a495-51825049a597" containerName="watcher-api" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.227151 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.227156 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.227164 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="423fd545-0438-4c3e-be37-b48929655ae8" containerName="mariadb-account-delete" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.227726 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-vvs4k" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.235903 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-vvs4k"] Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.263505 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="423fd545-0438-4c3e-be37-b48929655ae8" path="/var/lib/kubelet/pods/423fd545-0438-4c3e-be37-b48929655ae8/volumes" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.264188 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75425853-3067-431a-8a1c-49370e8c6516" path="/var/lib/kubelet/pods/75425853-3067-431a-8a1c-49370e8c6516/volumes" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.264870 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca0344fc-bf5f-4b80-a634-857e0851ea08" path="/var/lib/kubelet/pods/ca0344fc-bf5f-4b80-a634-857e0851ea08/volumes" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.270514 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaec35c-570b-4421-86b9-250313dfe459-operator-scripts\") pod \"watcher-db-create-vvs4k\" (UID: \"4eaec35c-570b-4421-86b9-250313dfe459\") " pod="watcher-kuttl-default/watcher-db-create-vvs4k" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.270650 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htttv\" (UniqueName: \"kubernetes.io/projected/4eaec35c-570b-4421-86b9-250313dfe459-kube-api-access-htttv\") pod \"watcher-db-create-vvs4k\" (UID: \"4eaec35c-570b-4421-86b9-250313dfe459\") " pod="watcher-kuttl-default/watcher-db-create-vvs4k" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.319765 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb"] Dec 05 16:30:59 crc kubenswrapper[4778]: E1205 16:30:59.320247 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.320268 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: E1205 16:30:59.320289 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.320296 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: E1205 16:30:59.320310 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.320320 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.320534 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.320555 4778 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d171fd-e9d8-4778-917f-ccfad7c27404" containerName="watcher-decision-engine" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.321169 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.328821 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.381910 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb"] Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.384006 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaec35c-570b-4421-86b9-250313dfe459-operator-scripts\") pod \"watcher-db-create-vvs4k\" (UID: \"4eaec35c-570b-4421-86b9-250313dfe459\") " pod="watcher-kuttl-default/watcher-db-create-vvs4k" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.384928 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htttv\" (UniqueName: \"kubernetes.io/projected/4eaec35c-570b-4421-86b9-250313dfe459-kube-api-access-htttv\") pod \"watcher-db-create-vvs4k\" (UID: \"4eaec35c-570b-4421-86b9-250313dfe459\") " pod="watcher-kuttl-default/watcher-db-create-vvs4k" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.385143 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaec35c-570b-4421-86b9-250313dfe459-operator-scripts\") pod \"watcher-db-create-vvs4k\" (UID: \"4eaec35c-570b-4421-86b9-250313dfe459\") " pod="watcher-kuttl-default/watcher-db-create-vvs4k" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.430492 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htttv\" (UniqueName: \"kubernetes.io/projected/4eaec35c-570b-4421-86b9-250313dfe459-kube-api-access-htttv\") pod \"watcher-db-create-vvs4k\" (UID: \"4eaec35c-570b-4421-86b9-250313dfe459\") " pod="watcher-kuttl-default/watcher-db-create-vvs4k" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.486349 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2slqj\" (UniqueName: \"kubernetes.io/projected/ec32a4a4-dcba-4160-8acf-f9910c62c6b1-kube-api-access-2slqj\") pod \"watcher-3a3b-account-create-update-srfdb\" (UID: \"ec32a4a4-dcba-4160-8acf-f9910c62c6b1\") " pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.486449 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec32a4a4-dcba-4160-8acf-f9910c62c6b1-operator-scripts\") pod \"watcher-3a3b-account-create-update-srfdb\" (UID: \"ec32a4a4-dcba-4160-8acf-f9910c62c6b1\") " pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.495064 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.553011 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-vvs4k" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.588021 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec32a4a4-dcba-4160-8acf-f9910c62c6b1-operator-scripts\") pod \"watcher-3a3b-account-create-update-srfdb\" (UID: \"ec32a4a4-dcba-4160-8acf-f9910c62c6b1\") " pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.588613 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2slqj\" (UniqueName: \"kubernetes.io/projected/ec32a4a4-dcba-4160-8acf-f9910c62c6b1-kube-api-access-2slqj\") pod \"watcher-3a3b-account-create-update-srfdb\" (UID: \"ec32a4a4-dcba-4160-8acf-f9910c62c6b1\") " pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.589154 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec32a4a4-dcba-4160-8acf-f9910c62c6b1-operator-scripts\") pod \"watcher-3a3b-account-create-update-srfdb\" (UID: \"ec32a4a4-dcba-4160-8acf-f9910c62c6b1\") " pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.605939 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2slqj\" (UniqueName: \"kubernetes.io/projected/ec32a4a4-dcba-4160-8acf-f9910c62c6b1-kube-api-access-2slqj\") pod \"watcher-3a3b-account-create-update-srfdb\" (UID: \"ec32a4a4-dcba-4160-8acf-f9910c62c6b1\") " pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.689914 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-585xh\" (UniqueName: \"kubernetes.io/projected/828b1128-474f-4ce3-a67d-b0f9ac493824-kube-api-access-585xh\") pod \"828b1128-474f-4ce3-a67d-b0f9ac493824\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.689981 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/828b1128-474f-4ce3-a67d-b0f9ac493824-logs\") pod \"828b1128-474f-4ce3-a67d-b0f9ac493824\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.690036 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828b1128-474f-4ce3-a67d-b0f9ac493824-combined-ca-bundle\") pod \"828b1128-474f-4ce3-a67d-b0f9ac493824\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.690110 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828b1128-474f-4ce3-a67d-b0f9ac493824-config-data\") pod \"828b1128-474f-4ce3-a67d-b0f9ac493824\" (UID: \"828b1128-474f-4ce3-a67d-b0f9ac493824\") " Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.692021 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/828b1128-474f-4ce3-a67d-b0f9ac493824-logs" (OuterVolumeSpecName: "logs") pod "828b1128-474f-4ce3-a67d-b0f9ac493824" (UID: "828b1128-474f-4ce3-a67d-b0f9ac493824"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.708758 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/828b1128-474f-4ce3-a67d-b0f9ac493824-kube-api-access-585xh" (OuterVolumeSpecName: "kube-api-access-585xh") pod "828b1128-474f-4ce3-a67d-b0f9ac493824" (UID: "828b1128-474f-4ce3-a67d-b0f9ac493824"). InnerVolumeSpecName "kube-api-access-585xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.716715 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.721345 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828b1128-474f-4ce3-a67d-b0f9ac493824-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "828b1128-474f-4ce3-a67d-b0f9ac493824" (UID: "828b1128-474f-4ce3-a67d-b0f9ac493824"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.753441 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828b1128-474f-4ce3-a67d-b0f9ac493824-config-data" (OuterVolumeSpecName: "config-data") pod "828b1128-474f-4ce3-a67d-b0f9ac493824" (UID: "828b1128-474f-4ce3-a67d-b0f9ac493824"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.791444 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-585xh\" (UniqueName: \"kubernetes.io/projected/828b1128-474f-4ce3-a67d-b0f9ac493824-kube-api-access-585xh\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.791482 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/828b1128-474f-4ce3-a67d-b0f9ac493824-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.791492 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828b1128-474f-4ce3-a67d-b0f9ac493824-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.791500 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828b1128-474f-4ce3-a67d-b0f9ac493824-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.924448 4778 generic.go:334] "Generic (PLEG): container finished" podID="828b1128-474f-4ce3-a67d-b0f9ac493824" containerID="77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a" exitCode=0 Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.924488 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"828b1128-474f-4ce3-a67d-b0f9ac493824","Type":"ContainerDied","Data":"77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a"} Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.924513 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" 
event={"ID":"828b1128-474f-4ce3-a67d-b0f9ac493824","Type":"ContainerDied","Data":"3922efa06ca427c639b8f4a3a46726d71667cd834dfd50a5862d1a02c49ced25"} Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.924528 4778 scope.go:117] "RemoveContainer" containerID="77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.924630 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.947972 4778 scope.go:117] "RemoveContainer" containerID="77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a" Dec 05 16:30:59 crc kubenswrapper[4778]: E1205 16:30:59.949049 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a\": container with ID starting with 77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a not found: ID does not exist" containerID="77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.949093 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a"} err="failed to get container status \"77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a\": rpc error: code = NotFound desc = could not find container \"77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a\": container with ID starting with 77fdea1657a5ee0091d7984fd0bf1038a97eb48f446dd1527721cb86bb62db0a not found: ID does not exist" Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.965440 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:30:59 crc kubenswrapper[4778]: I1205 16:30:59.971141 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:31:00 crc kubenswrapper[4778]: I1205 16:30:59.999310 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-vvs4k"] Dec 05 16:31:00 crc kubenswrapper[4778]: I1205 16:31:00.169527 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb"] Dec 05 16:31:00 crc kubenswrapper[4778]: W1205 16:31:00.177557 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec32a4a4_dcba_4160_8acf_f9910c62c6b1.slice/crio-23bf4e17707cef60636fa43957ff6783711eb8900de3b1c0454688e767a501c9 WatchSource:0}: Error finding container 23bf4e17707cef60636fa43957ff6783711eb8900de3b1c0454688e767a501c9: Status 404 returned error can't find the container with id 23bf4e17707cef60636fa43957ff6783711eb8900de3b1c0454688e767a501c9 Dec 05 16:31:00 crc kubenswrapper[4778]: E1205 16:31:00.915839 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec32a4a4_dcba_4160_8acf_f9910c62c6b1.slice/crio-conmon-388d96eda6e3b2d009e48735e676c2b1521389e5f7b25e5ad22cab702b7cd001.scope\": RecentStats: unable to find data in memory cache]" Dec 05 16:31:00 crc kubenswrapper[4778]: I1205 16:31:00.932589 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="4eaec35c-570b-4421-86b9-250313dfe459" containerID="6ac7b39e755a82306960e7fc086adaa8c4d26755dbe978f26ed5cee5d0504fcf" exitCode=0 Dec 05 16:31:00 crc kubenswrapper[4778]: I1205 16:31:00.932659 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-vvs4k" event={"ID":"4eaec35c-570b-4421-86b9-250313dfe459","Type":"ContainerDied","Data":"6ac7b39e755a82306960e7fc086adaa8c4d26755dbe978f26ed5cee5d0504fcf"} Dec 05 16:31:00 crc kubenswrapper[4778]: I1205 16:31:00.932767 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-vvs4k" event={"ID":"4eaec35c-570b-4421-86b9-250313dfe459","Type":"ContainerStarted","Data":"c3cdd04b2a831ca6f41e9a0c761a19163536b98be527f84773d9b490cdd7c23d"} Dec 05 16:31:00 crc kubenswrapper[4778]: I1205 16:31:00.936241 4778 generic.go:334] "Generic (PLEG): container finished" podID="ec32a4a4-dcba-4160-8acf-f9910c62c6b1" containerID="388d96eda6e3b2d009e48735e676c2b1521389e5f7b25e5ad22cab702b7cd001" exitCode=0 Dec 05 16:31:00 crc kubenswrapper[4778]: I1205 16:31:00.936273 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" event={"ID":"ec32a4a4-dcba-4160-8acf-f9910c62c6b1","Type":"ContainerDied","Data":"388d96eda6e3b2d009e48735e676c2b1521389e5f7b25e5ad22cab702b7cd001"} Dec 05 16:31:00 crc kubenswrapper[4778]: I1205 16:31:00.936301 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" event={"ID":"ec32a4a4-dcba-4160-8acf-f9910c62c6b1","Type":"ContainerStarted","Data":"23bf4e17707cef60636fa43957ff6783711eb8900de3b1c0454688e767a501c9"} Dec 05 16:31:01 crc kubenswrapper[4778]: I1205 16:31:01.261490 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="828b1128-474f-4ce3-a67d-b0f9ac493824" path="/var/lib/kubelet/pods/828b1128-474f-4ce3-a67d-b0f9ac493824/volumes" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.399215 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-vvs4k" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.410298 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.539654 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec32a4a4-dcba-4160-8acf-f9910c62c6b1-operator-scripts\") pod \"ec32a4a4-dcba-4160-8acf-f9910c62c6b1\" (UID: \"ec32a4a4-dcba-4160-8acf-f9910c62c6b1\") " Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.539698 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2slqj\" (UniqueName: \"kubernetes.io/projected/ec32a4a4-dcba-4160-8acf-f9910c62c6b1-kube-api-access-2slqj\") pod \"ec32a4a4-dcba-4160-8acf-f9910c62c6b1\" (UID: \"ec32a4a4-dcba-4160-8acf-f9910c62c6b1\") " Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.539764 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htttv\" (UniqueName: \"kubernetes.io/projected/4eaec35c-570b-4421-86b9-250313dfe459-kube-api-access-htttv\") pod \"4eaec35c-570b-4421-86b9-250313dfe459\" (UID: \"4eaec35c-570b-4421-86b9-250313dfe459\") " Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.539890 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaec35c-570b-4421-86b9-250313dfe459-operator-scripts\") pod \"4eaec35c-570b-4421-86b9-250313dfe459\" (UID: \"4eaec35c-570b-4421-86b9-250313dfe459\") " Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.540638 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eaec35c-570b-4421-86b9-250313dfe459-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4eaec35c-570b-4421-86b9-250313dfe459" (UID: "4eaec35c-570b-4421-86b9-250313dfe459"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.540695 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec32a4a4-dcba-4160-8acf-f9910c62c6b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec32a4a4-dcba-4160-8acf-f9910c62c6b1" (UID: "ec32a4a4-dcba-4160-8acf-f9910c62c6b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.540865 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaec35c-570b-4421-86b9-250313dfe459-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.540884 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec32a4a4-dcba-4160-8acf-f9910c62c6b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.545335 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eaec35c-570b-4421-86b9-250313dfe459-kube-api-access-htttv" (OuterVolumeSpecName: "kube-api-access-htttv") pod "4eaec35c-570b-4421-86b9-250313dfe459" (UID: "4eaec35c-570b-4421-86b9-250313dfe459"). InnerVolumeSpecName "kube-api-access-htttv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.546530 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec32a4a4-dcba-4160-8acf-f9910c62c6b1-kube-api-access-2slqj" (OuterVolumeSpecName: "kube-api-access-2slqj") pod "ec32a4a4-dcba-4160-8acf-f9910c62c6b1" (UID: "ec32a4a4-dcba-4160-8acf-f9910c62c6b1"). InnerVolumeSpecName "kube-api-access-2slqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.642043 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2slqj\" (UniqueName: \"kubernetes.io/projected/ec32a4a4-dcba-4160-8acf-f9910c62c6b1-kube-api-access-2slqj\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.642468 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htttv\" (UniqueName: \"kubernetes.io/projected/4eaec35c-570b-4421-86b9-250313dfe459-kube-api-access-htttv\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.955667 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" event={"ID":"ec32a4a4-dcba-4160-8acf-f9910c62c6b1","Type":"ContainerDied","Data":"23bf4e17707cef60636fa43957ff6783711eb8900de3b1c0454688e767a501c9"} Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.955705 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23bf4e17707cef60636fa43957ff6783711eb8900de3b1c0454688e767a501c9" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.956038 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.957502 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-vvs4k" event={"ID":"4eaec35c-570b-4421-86b9-250313dfe459","Type":"ContainerDied","Data":"c3cdd04b2a831ca6f41e9a0c761a19163536b98be527f84773d9b490cdd7c23d"} Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.957523 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3cdd04b2a831ca6f41e9a0c761a19163536b98be527f84773d9b490cdd7c23d" Dec 05 16:31:02 crc kubenswrapper[4778]: I1205 16:31:02.957598 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-vvs4k" Dec 05 16:31:03 crc kubenswrapper[4778]: I1205 16:31:03.414536 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:31:03 crc kubenswrapper[4778]: I1205 16:31:03.414608 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.572394 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd"] Dec 05 16:31:04 crc kubenswrapper[4778]: E1205 16:31:04.572780 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eaec35c-570b-4421-86b9-250313dfe459" containerName="mariadb-database-create" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.572797 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eaec35c-570b-4421-86b9-250313dfe459" containerName="mariadb-database-create" Dec 05 16:31:04 crc kubenswrapper[4778]: E1205 16:31:04.572823 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec32a4a4-dcba-4160-8acf-f9910c62c6b1" containerName="mariadb-account-create-update" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.572831 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec32a4a4-dcba-4160-8acf-f9910c62c6b1" containerName="mariadb-account-create-update" Dec 05 16:31:04 crc kubenswrapper[4778]: E1205 16:31:04.572852 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="828b1128-474f-4ce3-a67d-b0f9ac493824" containerName="watcher-applier" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.572861 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="828b1128-474f-4ce3-a67d-b0f9ac493824" containerName="watcher-applier" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.573037 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eaec35c-570b-4421-86b9-250313dfe459" containerName="mariadb-database-create" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.573066 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec32a4a4-dcba-4160-8acf-f9910c62c6b1" containerName="mariadb-account-create-update" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.573077 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="828b1128-474f-4ce3-a67d-b0f9ac493824" containerName="watcher-applier" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.573854 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.576383 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-zgls7" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.576614 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.582919 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd"] Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.679115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-config-data\") pod \"watcher-kuttl-db-sync-nv5qd\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.679279 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbx6s\" (UniqueName: \"kubernetes.io/projected/879c7499-2da6-4974-a832-e969d58d34c3-kube-api-access-nbx6s\") pod \"watcher-kuttl-db-sync-nv5qd\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.680146 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-db-sync-config-data\") pod \"watcher-kuttl-db-sync-nv5qd\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.680247 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-nv5qd\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.782002 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-db-sync-config-data\") pod \"watcher-kuttl-db-sync-nv5qd\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.782606 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-nv5qd\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.782925 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-config-data\") pod \"watcher-kuttl-db-sync-nv5qd\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc 
kubenswrapper[4778]: I1205 16:31:04.783180 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbx6s\" (UniqueName: \"kubernetes.io/projected/879c7499-2da6-4974-a832-e969d58d34c3-kube-api-access-nbx6s\") pod \"watcher-kuttl-db-sync-nv5qd\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.786655 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-db-sync-config-data\") pod \"watcher-kuttl-db-sync-nv5qd\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.801273 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-config-data\") pod \"watcher-kuttl-db-sync-nv5qd\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.806603 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbx6s\" (UniqueName: \"kubernetes.io/projected/879c7499-2da6-4974-a832-e969d58d34c3-kube-api-access-nbx6s\") pod \"watcher-kuttl-db-sync-nv5qd\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.812112 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-nv5qd\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:04 crc kubenswrapper[4778]: I1205 16:31:04.900980 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:05 crc kubenswrapper[4778]: I1205 16:31:05.496591 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd"] Dec 05 16:31:05 crc kubenswrapper[4778]: I1205 16:31:05.984501 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" event={"ID":"879c7499-2da6-4974-a832-e969d58d34c3","Type":"ContainerStarted","Data":"7160fe385086b3b87f7c8ddeeef79fe328ce041405f1121952ad3a8ba05ca9ce"} Dec 05 16:31:05 crc kubenswrapper[4778]: I1205 16:31:05.984793 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" event={"ID":"879c7499-2da6-4974-a832-e969d58d34c3","Type":"ContainerStarted","Data":"b4f3ac3985ebf759fd12cb1f452a279a6f9ed6ddd4db6da3580fa10aed643bf4"} Dec 05 16:31:06 crc kubenswrapper[4778]: I1205 16:31:06.004758 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" podStartSLOduration=2.004735534 podStartE2EDuration="2.004735534s" podCreationTimestamp="2025-12-05 16:31:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:31:06.001464876 +0000 UTC m=+2153.105261256" watchObservedRunningTime="2025-12-05 16:31:06.004735534 +0000 UTC m=+2153.108531914" Dec 05 16:31:09 crc kubenswrapper[4778]: I1205 16:31:09.007546 4778 generic.go:334] "Generic (PLEG): container finished" podID="879c7499-2da6-4974-a832-e969d58d34c3" containerID="7160fe385086b3b87f7c8ddeeef79fe328ce041405f1121952ad3a8ba05ca9ce" exitCode=0 Dec 05 16:31:09 crc kubenswrapper[4778]: I1205 16:31:09.007640 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" event={"ID":"879c7499-2da6-4974-a832-e969d58d34c3","Type":"ContainerDied","Data":"7160fe385086b3b87f7c8ddeeef79fe328ce041405f1121952ad3a8ba05ca9ce"} Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.306723 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jsnqq"] Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.309167 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.320958 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jsnqq"] Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.390788 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.476439 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-utilities\") pod \"community-operators-jsnqq\" (UID: \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\") " pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.476576 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-catalog-content\") pod \"community-operators-jsnqq\" (UID: \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\") " pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.476832 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfvkj\" (UniqueName: \"kubernetes.io/projected/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-kube-api-access-xfvkj\") pod \"community-operators-jsnqq\" (UID: \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\") " pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.578081 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-db-sync-config-data\") pod \"879c7499-2da6-4974-a832-e969d58d34c3\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.578440 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbx6s\" (UniqueName: \"kubernetes.io/projected/879c7499-2da6-4974-a832-e969d58d34c3-kube-api-access-nbx6s\") pod \"879c7499-2da6-4974-a832-e969d58d34c3\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.578636 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-combined-ca-bundle\") pod \"879c7499-2da6-4974-a832-e969d58d34c3\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.578741 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-config-data\") pod \"879c7499-2da6-4974-a832-e969d58d34c3\" (UID: \"879c7499-2da6-4974-a832-e969d58d34c3\") " Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.578986 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-utilities\") pod \"community-operators-jsnqq\" (UID: \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\") " pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.579091 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-catalog-content\") pod \"community-operators-jsnqq\" (UID: \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\") " pod="openshift-marketplace/community-operators-jsnqq" Dec 05 
16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.579283 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfvkj\" (UniqueName: \"kubernetes.io/projected/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-kube-api-access-xfvkj\") pod \"community-operators-jsnqq\" (UID: \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\") " pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.579391 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-catalog-content\") pod \"community-operators-jsnqq\" (UID: \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\") " pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.579338 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-utilities\") pod \"community-operators-jsnqq\" (UID: \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\") " pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.583699 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/879c7499-2da6-4974-a832-e969d58d34c3-kube-api-access-nbx6s" (OuterVolumeSpecName: "kube-api-access-nbx6s") pod "879c7499-2da6-4974-a832-e969d58d34c3" (UID: "879c7499-2da6-4974-a832-e969d58d34c3"). InnerVolumeSpecName "kube-api-access-nbx6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.589600 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "879c7499-2da6-4974-a832-e969d58d34c3" (UID: "879c7499-2da6-4974-a832-e969d58d34c3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.603063 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfvkj\" (UniqueName: \"kubernetes.io/projected/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-kube-api-access-xfvkj\") pod \"community-operators-jsnqq\" (UID: \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\") " pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.606655 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "879c7499-2da6-4974-a832-e969d58d34c3" (UID: "879c7499-2da6-4974-a832-e969d58d34c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.620832 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-config-data" (OuterVolumeSpecName: "config-data") pod "879c7499-2da6-4974-a832-e969d58d34c3" (UID: "879c7499-2da6-4974-a832-e969d58d34c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.682010 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.682075 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbx6s\" (UniqueName: \"kubernetes.io/projected/879c7499-2da6-4974-a832-e969d58d34c3-kube-api-access-nbx6s\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.682095 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.682111 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/879c7499-2da6-4974-a832-e969d58d34c3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:10 crc kubenswrapper[4778]: I1205 16:31:10.708790 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.030608 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" event={"ID":"879c7499-2da6-4974-a832-e969d58d34c3","Type":"ContainerDied","Data":"b4f3ac3985ebf759fd12cb1f452a279a6f9ed6ddd4db6da3580fa10aed643bf4"} Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.031104 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4f3ac3985ebf759fd12cb1f452a279a6f9ed6ddd4db6da3580fa10aed643bf4" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.031182 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.267844 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:31:11 crc kubenswrapper[4778]: E1205 16:31:11.268222 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879c7499-2da6-4974-a832-e969d58d34c3" containerName="watcher-kuttl-db-sync" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.268241 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="879c7499-2da6-4974-a832-e969d58d34c3" containerName="watcher-kuttl-db-sync" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.268459 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="879c7499-2da6-4974-a832-e969d58d34c3" containerName="watcher-kuttl-db-sync" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.269095 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.275989 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.279327 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: E1205 16:31:11.280963 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod879c7499_2da6_4974_a832_e969d58d34c3.slice/crio-b4f3ac3985ebf759fd12cb1f452a279a6f9ed6ddd4db6da3580fa10aed643bf4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod879c7499_2da6_4974_a832_e969d58d34c3.slice\": RecentStats: unable to find data in memory cache]" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.297273 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.297818 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-zgls7" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.300589 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jsnqq"] Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.314852 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.315138 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.315250 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.355426 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.365978 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.376529 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.383129 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.387874 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.395069 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2f4c\" (UniqueName: \"kubernetes.io/projected/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-kube-api-access-z2f4c\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.395120 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-logs\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.395162 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.395207 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.395245 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.395277 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.395308 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.395614 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52bb4\" (UniqueName: \"kubernetes.io/projected/f83b7df7-adc4-4c29-8805-a7c99abc7f29-kube-api-access-52bb4\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.395647 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.395675 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.395695 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83b7df7-adc4-4c29-8805-a7c99abc7f29-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.395804 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.406822 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496685 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2f4c\" (UniqueName: \"kubernetes.io/projected/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-kube-api-access-z2f4c\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496730 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-logs\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496754 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496774 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496805 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496824 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496846 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496868 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d6a2d36-d40e-4057-9fad-7995792e6351-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496884 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5nll\" (UniqueName: \"kubernetes.io/projected/5d6a2d36-d40e-4057-9fad-7995792e6351-kube-api-access-w5nll\") pod \"watcher-kuttl-applier-0\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496904 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6a2d36-d40e-4057-9fad-7995792e6351-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496929 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52bb4\" (UniqueName: \"kubernetes.io/projected/f83b7df7-adc4-4c29-8805-a7c99abc7f29-kube-api-access-52bb4\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496947 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496963 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.496977 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83b7df7-adc4-4c29-8805-a7c99abc7f29-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.497002 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6a2d36-d40e-4057-9fad-7995792e6351-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.497063 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.498701 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-logs\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.500832 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83b7df7-adc4-4c29-8805-a7c99abc7f29-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.506483 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.507197 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.511382 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.511891 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.515872 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.519117 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.519955 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.520349 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.523336 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2f4c\" (UniqueName: \"kubernetes.io/projected/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-kube-api-access-z2f4c\") pod \"watcher-kuttl-api-0\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.528815 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52bb4\" (UniqueName: \"kubernetes.io/projected/f83b7df7-adc4-4c29-8805-a7c99abc7f29-kube-api-access-52bb4\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.598625 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d6a2d36-d40e-4057-9fad-7995792e6351-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.598667 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5nll\" (UniqueName: \"kubernetes.io/projected/5d6a2d36-d40e-4057-9fad-7995792e6351-kube-api-access-w5nll\") pod \"watcher-kuttl-applier-0\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.598695 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6a2d36-d40e-4057-9fad-7995792e6351-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.598745 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d6a2d36-d40e-4057-9fad-7995792e6351-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.599112 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d6a2d36-d40e-4057-9fad-7995792e6351-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.602515 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6a2d36-d40e-4057-9fad-7995792e6351-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.603095 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6a2d36-d40e-4057-9fad-7995792e6351-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.613902 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5nll\" (UniqueName: \"kubernetes.io/projected/5d6a2d36-d40e-4057-9fad-7995792e6351-kube-api-access-w5nll\") pod \"watcher-kuttl-applier-0\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.615743 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.725781 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:11 crc kubenswrapper[4778]: I1205 16:31:11.750377 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:12 crc kubenswrapper[4778]: I1205 16:31:12.040591 4778 generic.go:334] "Generic (PLEG): container finished" podID="59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" containerID="35b5a7d2e19ae7b8428f0e70ff7acf8f091e48835e1d92aa021105a73ae8b3eb" exitCode=0 Dec 05 16:31:12 crc kubenswrapper[4778]: I1205 16:31:12.041143 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsnqq" event={"ID":"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5","Type":"ContainerDied","Data":"35b5a7d2e19ae7b8428f0e70ff7acf8f091e48835e1d92aa021105a73ae8b3eb"} Dec 05 16:31:12 crc kubenswrapper[4778]: I1205 16:31:12.041180 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsnqq" event={"ID":"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5","Type":"ContainerStarted","Data":"7767eb3d3624cf957262f9ea04c1fed27c296f4a7005f9f4cdcb98f5666df8ae"} Dec 05 16:31:12 crc kubenswrapper[4778]: I1205 16:31:12.110051 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:31:12 crc kubenswrapper[4778]: W1205 16:31:12.120727 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf83b7df7_adc4_4c29_8805_a7c99abc7f29.slice/crio-3fa410e0bc6579e5f0d5b2a995f81cc7524ad4e69d70b60acbd4d1bfc87ede99 WatchSource:0}: Error finding container 3fa410e0bc6579e5f0d5b2a995f81cc7524ad4e69d70b60acbd4d1bfc87ede99: Status 404 returned error can't find the container with id 3fa410e0bc6579e5f0d5b2a995f81cc7524ad4e69d70b60acbd4d1bfc87ede99 Dec 05 16:31:12 crc kubenswrapper[4778]: I1205 16:31:12.226818 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:31:12 crc kubenswrapper[4778]: W1205 16:31:12.230015 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d6a2d36_d40e_4057_9fad_7995792e6351.slice/crio-adab532763a91922a2129cf543dfe224758b6c2386dbd98862b845389cb2ec3e WatchSource:0}: Error finding container adab532763a91922a2129cf543dfe224758b6c2386dbd98862b845389cb2ec3e: Status 404 returned error can't find the container with id adab532763a91922a2129cf543dfe224758b6c2386dbd98862b845389cb2ec3e Dec 05 16:31:12 crc kubenswrapper[4778]: I1205 16:31:12.241821 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:31:12 crc kubenswrapper[4778]: W1205 16:31:12.255450 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfae1f369_b35f_440d_9c9a_1319dd8a1dcd.slice/crio-e59227d4457a6eb3e19f8dd2369c5e28106c7bfa38ddd07e5bb6469df58ecf8c WatchSource:0}: Error finding container e59227d4457a6eb3e19f8dd2369c5e28106c7bfa38ddd07e5bb6469df58ecf8c: Status 404 returned error can't find the container with id e59227d4457a6eb3e19f8dd2369c5e28106c7bfa38ddd07e5bb6469df58ecf8c Dec 05 16:31:13 crc kubenswrapper[4778]: I1205 16:31:13.049457 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"5d6a2d36-d40e-4057-9fad-7995792e6351","Type":"ContainerStarted","Data":"3d1e99de35edfa2cdfc436bdd862bf05a87012ccc513d115875bd95886b85384"} Dec 05 16:31:13 crc kubenswrapper[4778]: I1205 16:31:13.050051 4778 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"5d6a2d36-d40e-4057-9fad-7995792e6351","Type":"ContainerStarted","Data":"adab532763a91922a2129cf543dfe224758b6c2386dbd98862b845389cb2ec3e"} Dec 05 16:31:13 crc kubenswrapper[4778]: I1205 16:31:13.054008 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"fae1f369-b35f-440d-9c9a-1319dd8a1dcd","Type":"ContainerStarted","Data":"ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34"} Dec 05 16:31:13 crc kubenswrapper[4778]: I1205 16:31:13.054051 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"fae1f369-b35f-440d-9c9a-1319dd8a1dcd","Type":"ContainerStarted","Data":"6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b"} Dec 05 16:31:13 crc kubenswrapper[4778]: I1205 16:31:13.054063 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"fae1f369-b35f-440d-9c9a-1319dd8a1dcd","Type":"ContainerStarted","Data":"e59227d4457a6eb3e19f8dd2369c5e28106c7bfa38ddd07e5bb6469df58ecf8c"} Dec 05 16:31:13 crc kubenswrapper[4778]: I1205 16:31:13.055047 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:13 crc kubenswrapper[4778]: I1205 16:31:13.059928 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsnqq" event={"ID":"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5","Type":"ContainerStarted","Data":"16bb305ced4db7cef60ff009c38c89585cd9c320ebe75b3ac5024affbfbe8fe1"} Dec 05 16:31:13 crc kubenswrapper[4778]: I1205 16:31:13.062129 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerStarted","Data":"6a14a551ab136c26201824b119910f4694308c1463f322ac96f7bab64b7656ae"} Dec 05 16:31:13 crc kubenswrapper[4778]: I1205 16:31:13.062160 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerStarted","Data":"3fa410e0bc6579e5f0d5b2a995f81cc7524ad4e69d70b60acbd4d1bfc87ede99"} Dec 05 16:31:13 crc kubenswrapper[4778]: I1205 16:31:13.078334 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.078313394 podStartE2EDuration="2.078313394s" podCreationTimestamp="2025-12-05 16:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:31:13.074199213 +0000 UTC m=+2160.177995613" watchObservedRunningTime="2025-12-05 16:31:13.078313394 +0000 UTC m=+2160.182109774" Dec 05 16:31:13 crc kubenswrapper[4778]: I1205 16:31:13.142071 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.142052864 podStartE2EDuration="2.142052864s" podCreationTimestamp="2025-12-05 16:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:31:13.137514823 +0000 UTC m=+2160.241311203" watchObservedRunningTime="2025-12-05 16:31:13.142052864 +0000 UTC m=+2160.245849244" Dec 05 16:31:14 crc kubenswrapper[4778]: I1205 16:31:14.073204 
4778 generic.go:334] "Generic (PLEG): container finished" podID="59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" containerID="16bb305ced4db7cef60ff009c38c89585cd9c320ebe75b3ac5024affbfbe8fe1" exitCode=0 Dec 05 16:31:14 crc kubenswrapper[4778]: I1205 16:31:14.073333 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsnqq" event={"ID":"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5","Type":"ContainerDied","Data":"16bb305ced4db7cef60ff009c38c89585cd9c320ebe75b3ac5024affbfbe8fe1"} Dec 05 16:31:14 crc kubenswrapper[4778]: I1205 16:31:14.099464 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=3.099447575 podStartE2EDuration="3.099447575s" podCreationTimestamp="2025-12-05 16:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:31:13.199896016 +0000 UTC m=+2160.303692396" watchObservedRunningTime="2025-12-05 16:31:14.099447575 +0000 UTC m=+2161.203243955" Dec 05 16:31:15 crc kubenswrapper[4778]: I1205 16:31:15.082998 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 16:31:15 crc kubenswrapper[4778]: I1205 16:31:15.083008 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsnqq" event={"ID":"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5","Type":"ContainerStarted","Data":"3604f169d40bdf7703193eedf353a0d6ea67959d49c9d8327f1e7ea3d973d4b6"} Dec 05 16:31:15 crc kubenswrapper[4778]: I1205 16:31:15.115723 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jsnqq" podStartSLOduration=2.653414783 podStartE2EDuration="5.115704105s" podCreationTimestamp="2025-12-05 16:31:10 +0000 UTC" firstStartedPulling="2025-12-05 16:31:12.049788155 +0000 UTC m=+2159.153584535" lastFinishedPulling="2025-12-05 16:31:14.512077467 +0000 UTC m=+2161.615873857" observedRunningTime="2025-12-05 16:31:15.114680458 +0000 UTC m=+2162.218476848" watchObservedRunningTime="2025-12-05 16:31:15.115704105 +0000 UTC m=+2162.219500485" Dec 05 16:31:15 crc kubenswrapper[4778]: I1205 16:31:15.570091 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:16 crc kubenswrapper[4778]: I1205 16:31:16.093122 4778 generic.go:334] "Generic (PLEG): container finished" podID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerID="6a14a551ab136c26201824b119910f4694308c1463f322ac96f7bab64b7656ae" exitCode=1 Dec 05 16:31:16 crc kubenswrapper[4778]: I1205 16:31:16.093222 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerDied","Data":"6a14a551ab136c26201824b119910f4694308c1463f322ac96f7bab64b7656ae"} Dec 05 16:31:16 crc kubenswrapper[4778]: I1205 16:31:16.093988 4778 scope.go:117] "RemoveContainer" containerID="6a14a551ab136c26201824b119910f4694308c1463f322ac96f7bab64b7656ae" Dec 05 16:31:16 crc kubenswrapper[4778]: I1205 16:31:16.727160 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:16 crc kubenswrapper[4778]: I1205 16:31:16.750693 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:17 crc kubenswrapper[4778]: I1205 
16:31:17.101429 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerStarted","Data":"3c3d854d6d59314ddf9938c51275bfe67f5ebf16d78c4c1567f2e22e7c0a1832"} Dec 05 16:31:20 crc kubenswrapper[4778]: I1205 16:31:20.128189 4778 generic.go:334] "Generic (PLEG): container finished" podID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerID="3c3d854d6d59314ddf9938c51275bfe67f5ebf16d78c4c1567f2e22e7c0a1832" exitCode=1 Dec 05 16:31:20 crc kubenswrapper[4778]: I1205 16:31:20.128270 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerDied","Data":"3c3d854d6d59314ddf9938c51275bfe67f5ebf16d78c4c1567f2e22e7c0a1832"} Dec 05 16:31:20 crc kubenswrapper[4778]: I1205 16:31:20.128571 4778 scope.go:117] "RemoveContainer" containerID="6a14a551ab136c26201824b119910f4694308c1463f322ac96f7bab64b7656ae" Dec 05 16:31:20 crc kubenswrapper[4778]: I1205 16:31:20.129282 4778 scope.go:117] "RemoveContainer" containerID="3c3d854d6d59314ddf9938c51275bfe67f5ebf16d78c4c1567f2e22e7c0a1832" Dec 05 16:31:20 crc kubenswrapper[4778]: E1205 16:31:20.129624 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:31:20 crc kubenswrapper[4778]: I1205 16:31:20.708974 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:20 crc kubenswrapper[4778]: I1205 16:31:20.709582 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:20 crc kubenswrapper[4778]: I1205 16:31:20.763762 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:21 crc kubenswrapper[4778]: I1205 16:31:21.179401 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:21 crc kubenswrapper[4778]: I1205 16:31:21.616885 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:21 crc kubenswrapper[4778]: I1205 16:31:21.616933 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:21 crc kubenswrapper[4778]: I1205 16:31:21.617518 4778 scope.go:117] "RemoveContainer" containerID="3c3d854d6d59314ddf9938c51275bfe67f5ebf16d78c4c1567f2e22e7c0a1832" Dec 05 16:31:21 crc kubenswrapper[4778]: E1205 16:31:21.617782 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:31:21 crc 
kubenswrapper[4778]: I1205 16:31:21.727615 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:21 crc kubenswrapper[4778]: I1205 16:31:21.743720 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:21 crc kubenswrapper[4778]: I1205 16:31:21.751218 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:21 crc kubenswrapper[4778]: I1205 16:31:21.779383 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:22 crc kubenswrapper[4778]: I1205 16:31:22.155759 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:31:22 crc kubenswrapper[4778]: I1205 16:31:22.173114 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:31:24 crc kubenswrapper[4778]: I1205 16:31:24.296400 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jsnqq"] Dec 05 16:31:24 crc kubenswrapper[4778]: I1205 16:31:24.296918 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jsnqq" podUID="59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" containerName="registry-server" containerID="cri-o://3604f169d40bdf7703193eedf353a0d6ea67959d49c9d8327f1e7ea3d973d4b6" gracePeriod=2 Dec 05 16:31:28 crc kubenswrapper[4778]: I1205 16:31:28.194776 4778 generic.go:334] "Generic (PLEG): container finished" podID="59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" containerID="3604f169d40bdf7703193eedf353a0d6ea67959d49c9d8327f1e7ea3d973d4b6" exitCode=0 Dec 05 16:31:28 crc kubenswrapper[4778]: I1205 16:31:28.194842 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsnqq" event={"ID":"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5","Type":"ContainerDied","Data":"3604f169d40bdf7703193eedf353a0d6ea67959d49c9d8327f1e7ea3d973d4b6"} Dec 05 16:31:28 crc kubenswrapper[4778]: I1205 16:31:28.737792 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:28 crc kubenswrapper[4778]: I1205 16:31:28.763761 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-utilities\") pod \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\" (UID: \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\") " Dec 05 16:31:28 crc kubenswrapper[4778]: I1205 16:31:28.763883 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfvkj\" (UniqueName: \"kubernetes.io/projected/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-kube-api-access-xfvkj\") pod \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\" (UID: \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\") " Dec 05 16:31:28 crc kubenswrapper[4778]: I1205 16:31:28.763921 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-catalog-content\") pod \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\" (UID: \"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5\") " Dec 05 16:31:28 crc kubenswrapper[4778]: I1205 16:31:28.767209 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-utilities" (OuterVolumeSpecName: "utilities") pod "59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" (UID: "59fa2df8-b0ca-4e81-b97e-bf84ece9fef5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:31:28 crc kubenswrapper[4778]: I1205 16:31:28.775322 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-kube-api-access-xfvkj" (OuterVolumeSpecName: "kube-api-access-xfvkj") pod "59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" (UID: "59fa2df8-b0ca-4e81-b97e-bf84ece9fef5"). InnerVolumeSpecName "kube-api-access-xfvkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:31:28 crc kubenswrapper[4778]: I1205 16:31:28.826143 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" (UID: "59fa2df8-b0ca-4e81-b97e-bf84ece9fef5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:31:28 crc kubenswrapper[4778]: I1205 16:31:28.865018 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfvkj\" (UniqueName: \"kubernetes.io/projected/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-kube-api-access-xfvkj\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:28 crc kubenswrapper[4778]: I1205 16:31:28.865054 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:28 crc kubenswrapper[4778]: I1205 16:31:28.865065 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:29 crc kubenswrapper[4778]: I1205 16:31:29.205145 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsnqq" event={"ID":"59fa2df8-b0ca-4e81-b97e-bf84ece9fef5","Type":"ContainerDied","Data":"7767eb3d3624cf957262f9ea04c1fed27c296f4a7005f9f4cdcb98f5666df8ae"} Dec 05 16:31:29 crc kubenswrapper[4778]: I1205 16:31:29.205193 4778 scope.go:117] "RemoveContainer" containerID="3604f169d40bdf7703193eedf353a0d6ea67959d49c9d8327f1e7ea3d973d4b6" Dec 05 16:31:29 crc kubenswrapper[4778]: I1205 16:31:29.205191 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jsnqq" Dec 05 16:31:29 crc kubenswrapper[4778]: I1205 16:31:29.235198 4778 scope.go:117] "RemoveContainer" containerID="16bb305ced4db7cef60ff009c38c89585cd9c320ebe75b3ac5024affbfbe8fe1" Dec 05 16:31:29 crc kubenswrapper[4778]: I1205 16:31:29.270562 4778 scope.go:117] "RemoveContainer" containerID="35b5a7d2e19ae7b8428f0e70ff7acf8f091e48835e1d92aa021105a73ae8b3eb" Dec 05 16:31:29 crc kubenswrapper[4778]: I1205 16:31:29.280280 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jsnqq"] Dec 05 16:31:29 crc kubenswrapper[4778]: I1205 16:31:29.280338 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jsnqq"] Dec 05 16:31:31 crc kubenswrapper[4778]: I1205 16:31:31.287959 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" path="/var/lib/kubelet/pods/59fa2df8-b0ca-4e81-b97e-bf84ece9fef5/volumes" Dec 05 16:31:33 crc kubenswrapper[4778]: I1205 16:31:33.415160 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:31:33 crc kubenswrapper[4778]: I1205 16:31:33.415228 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:31:34 crc kubenswrapper[4778]: I1205 16:31:34.249281 4778 scope.go:117] "RemoveContainer" containerID="3c3d854d6d59314ddf9938c51275bfe67f5ebf16d78c4c1567f2e22e7c0a1832" Dec 05 16:31:37 crc kubenswrapper[4778]: I1205 16:31:37.296562 4778 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerStarted","Data":"d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65"} Dec 05 16:31:41 crc kubenswrapper[4778]: I1205 16:31:41.615906 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:41 crc kubenswrapper[4778]: I1205 16:31:41.616476 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:41 crc kubenswrapper[4778]: E1205 16:31:41.616457 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65 is running failed: container process not found" containerID="d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 16:31:41 crc kubenswrapper[4778]: E1205 16:31:41.616868 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65 is running failed: container process not found" containerID="d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 16:31:41 crc kubenswrapper[4778]: E1205 16:31:41.617249 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65 is running failed: container process not found" containerID="d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 16:31:41 crc kubenswrapper[4778]: E1205 16:31:41.617334 4778 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65 is running failed: container process not found" probeType="Startup" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:31:42 crc kubenswrapper[4778]: I1205 16:31:42.335271 4778 generic.go:334] "Generic (PLEG): container finished" podID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerID="d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65" exitCode=1 Dec 05 16:31:42 crc kubenswrapper[4778]: I1205 16:31:42.335312 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerDied","Data":"d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65"} Dec 05 16:31:42 crc kubenswrapper[4778]: I1205 16:31:42.335343 4778 scope.go:117] "RemoveContainer" containerID="3c3d854d6d59314ddf9938c51275bfe67f5ebf16d78c4c1567f2e22e7c0a1832" Dec 05 16:31:42 crc kubenswrapper[4778]: I1205 16:31:42.336086 4778 scope.go:117] "RemoveContainer" containerID="d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65" Dec 05 16:31:42 crc kubenswrapper[4778]: E1205 16:31:42.336278 4778 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:31:51 crc kubenswrapper[4778]: I1205 16:31:51.616799 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:31:51 crc kubenswrapper[4778]: I1205 16:31:51.618019 4778 scope.go:117] "RemoveContainer" containerID="d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65" Dec 05 16:31:51 crc kubenswrapper[4778]: E1205 16:31:51.618336 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:32:03 crc kubenswrapper[4778]: I1205 16:32:03.415226 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:32:03 crc kubenswrapper[4778]: I1205 16:32:03.415825 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:32:03 crc kubenswrapper[4778]: I1205 16:32:03.415879 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 16:32:03 crc kubenswrapper[4778]: I1205 16:32:03.416759 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:32:03 crc kubenswrapper[4778]: I1205 16:32:03.416826 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" gracePeriod=600 Dec 05 16:32:03 crc kubenswrapper[4778]: E1205 16:32:03.544332 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:32:04 crc kubenswrapper[4778]: I1205 
16:32:04.249893 4778 scope.go:117] "RemoveContainer" containerID="d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65" Dec 05 16:32:04 crc kubenswrapper[4778]: I1205 16:32:04.532002 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerStarted","Data":"936b4d918c62e213d3b9cae56bc2b58e9e9bd6ab3cb2de27eaf0e3ce845e9a79"} Dec 05 16:32:04 crc kubenswrapper[4778]: I1205 16:32:04.535605 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" exitCode=0 Dec 05 16:32:04 crc kubenswrapper[4778]: I1205 16:32:04.535697 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c"} Dec 05 16:32:04 crc kubenswrapper[4778]: I1205 16:32:04.535767 4778 scope.go:117] "RemoveContainer" containerID="07f810bc57816803590a70eb4acea4f15e7b3ff3f3449761e7328bfde876054a" Dec 05 16:32:04 crc kubenswrapper[4778]: I1205 16:32:04.536112 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:32:04 crc kubenswrapper[4778]: E1205 16:32:04.536414 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:32:07 crc kubenswrapper[4778]: I1205 16:32:07.563780 4778 generic.go:334] "Generic (PLEG): container finished" podID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerID="936b4d918c62e213d3b9cae56bc2b58e9e9bd6ab3cb2de27eaf0e3ce845e9a79" exitCode=1 Dec 05 16:32:07 crc kubenswrapper[4778]: I1205 16:32:07.563863 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerDied","Data":"936b4d918c62e213d3b9cae56bc2b58e9e9bd6ab3cb2de27eaf0e3ce845e9a79"} Dec 05 16:32:07 crc kubenswrapper[4778]: I1205 16:32:07.564100 4778 scope.go:117] "RemoveContainer" containerID="d75981e39138fda69b39e06b50b67d32fe5f1e2418c5c28af9f0515920310a65" Dec 05 16:32:07 crc kubenswrapper[4778]: I1205 16:32:07.564691 4778 scope.go:117] "RemoveContainer" containerID="936b4d918c62e213d3b9cae56bc2b58e9e9bd6ab3cb2de27eaf0e3ce845e9a79" Dec 05 16:32:07 crc kubenswrapper[4778]: E1205 16:32:07.565016 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:32:11 crc kubenswrapper[4778]: I1205 16:32:11.616443 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:32:11 crc 
kubenswrapper[4778]: I1205 16:32:11.616955 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:32:11 crc kubenswrapper[4778]: I1205 16:32:11.616969 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:32:11 crc kubenswrapper[4778]: I1205 16:32:11.616981 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:32:11 crc kubenswrapper[4778]: I1205 16:32:11.617495 4778 scope.go:117] "RemoveContainer" containerID="936b4d918c62e213d3b9cae56bc2b58e9e9bd6ab3cb2de27eaf0e3ce845e9a79" Dec 05 16:32:11 crc kubenswrapper[4778]: E1205 16:32:11.617722 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:32:12 crc kubenswrapper[4778]: I1205 16:32:12.601748 4778 scope.go:117] "RemoveContainer" containerID="936b4d918c62e213d3b9cae56bc2b58e9e9bd6ab3cb2de27eaf0e3ce845e9a79" Dec 05 16:32:12 crc kubenswrapper[4778]: E1205 16:32:12.602314 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:32:16 crc kubenswrapper[4778]: I1205 16:32:16.249790 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:32:16 crc kubenswrapper[4778]: E1205 16:32:16.250286 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:32:23 crc kubenswrapper[4778]: I1205 16:32:23.259139 4778 scope.go:117] "RemoveContainer" containerID="936b4d918c62e213d3b9cae56bc2b58e9e9bd6ab3cb2de27eaf0e3ce845e9a79" Dec 05 16:32:23 crc kubenswrapper[4778]: E1205 16:32:23.260142 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:32:29 crc kubenswrapper[4778]: I1205 16:32:29.249773 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:32:29 crc kubenswrapper[4778]: E1205 16:32:29.250598 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:32:35 crc kubenswrapper[4778]: I1205 16:32:35.250026 4778 scope.go:117] "RemoveContainer" containerID="936b4d918c62e213d3b9cae56bc2b58e9e9bd6ab3cb2de27eaf0e3ce845e9a79" Dec 05 16:32:35 crc kubenswrapper[4778]: E1205 16:32:35.250899 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:32:41 crc kubenswrapper[4778]: I1205 16:32:41.250348 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:32:41 crc kubenswrapper[4778]: E1205 16:32:41.250979 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:32:48 crc kubenswrapper[4778]: I1205 16:32:48.105590 4778 scope.go:117] "RemoveContainer" containerID="ef29064e18100bd35c52e7374d9e4d2ed0ac6790d0745d07b1ed7990591f9f95" Dec 05 16:32:48 crc kubenswrapper[4778]: I1205 16:32:48.125988 4778 scope.go:117] "RemoveContainer" containerID="83d7affd3e3bfd6fd473509b0492f5babc93dd1363a510ade81e81a7eb0b1f54" Dec 05 16:32:48 crc kubenswrapper[4778]: I1205 16:32:48.177701 4778 scope.go:117] "RemoveContainer" containerID="925000619d53bbe63a1831dd13ba622e0aad205d8110b802c077c8630669cf4f" Dec 05 16:32:49 crc kubenswrapper[4778]: I1205 16:32:49.249586 4778 scope.go:117] "RemoveContainer" containerID="936b4d918c62e213d3b9cae56bc2b58e9e9bd6ab3cb2de27eaf0e3ce845e9a79" Dec 05 16:32:49 crc kubenswrapper[4778]: I1205 16:32:49.929728 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerStarted","Data":"ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f"} Dec 05 16:32:51 crc kubenswrapper[4778]: I1205 16:32:51.616045 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:32:51 crc kubenswrapper[4778]: I1205 16:32:51.648437 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:32:51 crc kubenswrapper[4778]: I1205 16:32:51.955199 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:32:51 crc kubenswrapper[4778]: I1205 16:32:51.986896 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:32:52 crc kubenswrapper[4778]: I1205 
16:32:52.966876 4778 generic.go:334] "Generic (PLEG): container finished" podID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerID="ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f" exitCode=1 Dec 05 16:32:52 crc kubenswrapper[4778]: I1205 16:32:52.966961 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerDied","Data":"ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f"} Dec 05 16:32:52 crc kubenswrapper[4778]: I1205 16:32:52.967272 4778 scope.go:117] "RemoveContainer" containerID="936b4d918c62e213d3b9cae56bc2b58e9e9bd6ab3cb2de27eaf0e3ce845e9a79" Dec 05 16:32:52 crc kubenswrapper[4778]: I1205 16:32:52.967690 4778 scope.go:117] "RemoveContainer" containerID="ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f" Dec 05 16:32:52 crc kubenswrapper[4778]: E1205 16:32:52.968152 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:32:53 crc kubenswrapper[4778]: I1205 16:32:53.979341 4778 scope.go:117] "RemoveContainer" containerID="ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f" Dec 05 16:32:53 crc kubenswrapper[4778]: E1205 16:32:53.980211 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:32:56 crc kubenswrapper[4778]: I1205 16:32:56.249415 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:32:56 crc kubenswrapper[4778]: E1205 16:32:56.249982 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:33:01 crc kubenswrapper[4778]: I1205 16:33:01.616496 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:33:01 crc kubenswrapper[4778]: I1205 16:33:01.617994 4778 scope.go:117] "RemoveContainer" containerID="ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f" Dec 05 16:33:01 crc kubenswrapper[4778]: E1205 16:33:01.618317 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:33:09 crc kubenswrapper[4778]: I1205 16:33:09.249722 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:33:09 crc kubenswrapper[4778]: E1205 16:33:09.250272 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:33:11 crc kubenswrapper[4778]: I1205 16:33:11.616714 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:33:11 crc kubenswrapper[4778]: I1205 16:33:11.617184 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:33:11 crc kubenswrapper[4778]: I1205 16:33:11.618100 4778 scope.go:117] "RemoveContainer" containerID="ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f" Dec 05 16:33:11 crc kubenswrapper[4778]: E1205 16:33:11.618339 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:33:21 crc kubenswrapper[4778]: I1205 16:33:21.249895 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:33:21 crc kubenswrapper[4778]: E1205 16:33:21.251015 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:33:24 crc kubenswrapper[4778]: I1205 16:33:24.249532 4778 scope.go:117] "RemoveContainer" containerID="ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f" Dec 05 16:33:24 crc kubenswrapper[4778]: E1205 16:33:24.250145 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:33:35 crc kubenswrapper[4778]: I1205 16:33:35.249640 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:33:35 crc kubenswrapper[4778]: E1205 16:33:35.250383 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:33:37 crc kubenswrapper[4778]: I1205 16:33:37.250715 4778 scope.go:117] "RemoveContainer" containerID="ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f" Dec 05 16:33:37 crc kubenswrapper[4778]: E1205 16:33:37.251034 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:33:46 crc kubenswrapper[4778]: I1205 16:33:46.250644 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:33:46 crc kubenswrapper[4778]: E1205 16:33:46.251232 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:33:50 crc kubenswrapper[4778]: I1205 16:33:50.249324 4778 scope.go:117] "RemoveContainer" containerID="ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f" Dec 05 16:33:50 crc kubenswrapper[4778]: E1205 16:33:50.250091 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:34:00 crc kubenswrapper[4778]: I1205 16:34:00.250802 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:34:00 crc kubenswrapper[4778]: E1205 16:34:00.251674 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:34:05 crc kubenswrapper[4778]: I1205 16:34:05.250141 4778 scope.go:117] "RemoveContainer" containerID="ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f" Dec 05 16:34:05 crc kubenswrapper[4778]: E1205 16:34:05.250894 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 
16:34:15 crc kubenswrapper[4778]: I1205 16:34:15.249765 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:34:15 crc kubenswrapper[4778]: E1205 16:34:15.250583 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:34:18 crc kubenswrapper[4778]: I1205 16:34:18.249251 4778 scope.go:117] "RemoveContainer" containerID="ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f" Dec 05 16:34:18 crc kubenswrapper[4778]: I1205 16:34:18.691338 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerStarted","Data":"9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b"} Dec 05 16:34:21 crc kubenswrapper[4778]: I1205 16:34:21.616698 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:34:21 crc kubenswrapper[4778]: E1205 16:34:21.617236 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b is running failed: container process not found" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 16:34:21 crc kubenswrapper[4778]: E1205 16:34:21.618785 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b is running failed: container process not found" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 16:34:21 crc kubenswrapper[4778]: E1205 16:34:21.619327 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b is running failed: container process not found" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 16:34:21 crc kubenswrapper[4778]: E1205 16:34:21.619414 4778 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b is running failed: container process not found" probeType="Startup" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:34:21 crc kubenswrapper[4778]: I1205 16:34:21.720499 4778 generic.go:334] "Generic (PLEG): container finished" podID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b" exitCode=1 Dec 05 16:34:21 crc 
kubenswrapper[4778]: I1205 16:34:21.720583 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerDied","Data":"9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b"} Dec 05 16:34:21 crc kubenswrapper[4778]: I1205 16:34:21.720667 4778 scope.go:117] "RemoveContainer" containerID="ff8fc75a1611836768873515801ee0267e611aed1e2895a0d3b0fcb580c2a69f" Dec 05 16:34:21 crc kubenswrapper[4778]: I1205 16:34:21.721501 4778 scope.go:117] "RemoveContainer" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b" Dec 05 16:34:21 crc kubenswrapper[4778]: E1205 16:34:21.721769 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:34:29 crc kubenswrapper[4778]: I1205 16:34:29.249847 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:34:29 crc kubenswrapper[4778]: E1205 16:34:29.250667 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:34:31 crc kubenswrapper[4778]: I1205 16:34:31.616383 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:34:31 crc kubenswrapper[4778]: I1205 16:34:31.617415 4778 scope.go:117] "RemoveContainer" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b" Dec 05 16:34:31 crc kubenswrapper[4778]: E1205 16:34:31.617680 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" Dec 05 16:34:41 crc kubenswrapper[4778]: I1205 16:34:41.249291 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:34:41 crc kubenswrapper[4778]: E1205 16:34:41.250737 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:34:41 crc kubenswrapper[4778]: I1205 16:34:41.616126 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 
Dec 05 16:34:41 crc kubenswrapper[4778]: I1205 16:34:41.616802 4778 scope.go:117] "RemoveContainer" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b"
Dec 05 16:34:41 crc kubenswrapper[4778]: E1205 16:34:41.617013 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29"
Dec 05 16:34:41 crc kubenswrapper[4778]: I1205 16:34:41.972953 4778 scope.go:117] "RemoveContainer" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b"
Dec 05 16:34:41 crc kubenswrapper[4778]: E1205 16:34:41.974016 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29"
Dec 05 16:34:54 crc kubenswrapper[4778]: I1205 16:34:54.249534 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c"
Dec 05 16:34:54 crc kubenswrapper[4778]: I1205 16:34:54.250252 4778 scope.go:117] "RemoveContainer" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b"
Dec 05 16:34:54 crc kubenswrapper[4778]: E1205 16:34:54.250419 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e"
Dec 05 16:34:54 crc kubenswrapper[4778]: E1205 16:34:54.250715 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29"
Dec 05 16:35:06 crc kubenswrapper[4778]: I1205 16:35:06.249191 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c"
Dec 05 16:35:06 crc kubenswrapper[4778]: E1205 16:35:06.249946 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e"
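[Note] The alternating "RemoveContainer" / "CrashLoopBackOff" pairs above are the kubelet's restart back-off in action: each failed restart doubles the delay, starting at 10s and capped at 5m (the kubelet's long-standing defaults). "back-off 2m40s" is 10s × 2^4 = 160s, so watcher-decision-engine has already failed several times; machine-config-daemon's "back-off 5m0s" means it is pinned at the cap. A sketch of the doubling-with-cap rule (the constants are the documented defaults, not read from this cluster's config):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Kubelet-style crash-loop back-off: double the restart delay after
	// each failure, capped at five minutes.
	base, maxDelay := 10*time.Second, 5*time.Minute
	delay := base
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("restart %d: back-off %s\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay // "back-off 5m0s", as machine-config-daemon shows above
		}
	}
	// The printed ladder passes through "back-off 2m40s" (10s * 2^4)
	// before settling at "back-off 5m0s".
}
```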
Dec 05 16:35:08 crc kubenswrapper[4778]: I1205 16:35:08.250347 4778 scope.go:117] "RemoveContainer" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b"
Dec 05 16:35:08 crc kubenswrapper[4778]: E1205 16:35:08.251248 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29"
Dec 05 16:35:18 crc kubenswrapper[4778]: I1205 16:35:18.250845 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c"
Dec 05 16:35:18 crc kubenswrapper[4778]: E1205 16:35:18.251679 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e"
Dec 05 16:35:20 crc kubenswrapper[4778]: I1205 16:35:20.249844 4778 scope.go:117] "RemoveContainer" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b"
Dec 05 16:35:20 crc kubenswrapper[4778]: E1205 16:35:20.250591 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29"
Dec 05 16:35:30 crc kubenswrapper[4778]: I1205 16:35:30.249734 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c"
Dec 05 16:35:30 crc kubenswrapper[4778]: E1205 16:35:30.251301 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e"
Dec 05 16:35:32 crc kubenswrapper[4778]: I1205 16:35:32.249909 4778 scope.go:117] "RemoveContainer" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b"
Dec 05 16:35:32 crc kubenswrapper[4778]: E1205 16:35:32.250315 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29"
Dec 05 16:35:41 crc kubenswrapper[4778]: I1205 16:35:41.249510 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c"
Dec 05 16:35:41 crc kubenswrapper[4778]: E1205 16:35:41.250066 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e"
Dec 05 16:35:43 crc kubenswrapper[4778]: I1205 16:35:43.256752 4778 scope.go:117] "RemoveContainer" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b"
Dec 05 16:35:43 crc kubenswrapper[4778]: E1205 16:35:43.257098 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(f83b7df7-adc4-4c29-8805-a7c99abc7f29)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29"
Dec 05 16:35:54 crc kubenswrapper[4778]: I1205 16:35:54.250654 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c"
Dec 05 16:35:54 crc kubenswrapper[4778]: E1205 16:35:54.251405 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e"
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.452391 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd"]
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.467303 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-nv5qd"]
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.507669 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher3a3b-account-delete-889pf"]
Dec 05 16:35:57 crc kubenswrapper[4778]: E1205 16:35:57.512972 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" containerName="extract-utilities"
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.513069 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" containerName="extract-utilities"
Dec 05 16:35:57 crc kubenswrapper[4778]: E1205 16:35:57.513147 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" containerName="extract-content"
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.513209 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" containerName="extract-content"
Dec 05 16:35:57 crc kubenswrapper[4778]: E1205 16:35:57.513286 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" containerName="registry-server"
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.513345 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" containerName="registry-server"
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.513568 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="59fa2df8-b0ca-4e81-b97e-bf84ece9fef5" containerName="registry-server"
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.515186 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf"
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.528730 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher3a3b-account-delete-889pf"]
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.592479 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mxwr\" (UniqueName: \"kubernetes.io/projected/b4fe1652-89cd-474a-9b27-3011459b6ae0-kube-api-access-8mxwr\") pod \"watcher3a3b-account-delete-889pf\" (UID: \"b4fe1652-89cd-474a-9b27-3011459b6ae0\") " pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf"
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.592761 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fe1652-89cd-474a-9b27-3011459b6ae0-operator-scripts\") pod \"watcher3a3b-account-delete-889pf\" (UID: \"b4fe1652-89cd-474a-9b27-3011459b6ae0\") " pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf"
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.612253 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.627880 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.628126 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="5d6a2d36-d40e-4057-9fad-7995792e6351" containerName="watcher-applier" containerID="cri-o://3d1e99de35edfa2cdfc436bdd862bf05a87012ccc513d115875bd95886b85384" gracePeriod=30
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.673192 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.673633 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="fae1f369-b35f-440d-9c9a-1319dd8a1dcd" containerName="watcher-kuttl-api-log" containerID="cri-o://6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b" gracePeriod=30
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.673984 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="fae1f369-b35f-440d-9c9a-1319dd8a1dcd" containerName="watcher-api" containerID="cri-o://ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34" gracePeriod=30
Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.694460 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mxwr\" (UniqueName: \"kubernetes.io/projected/b4fe1652-89cd-474a-9b27-3011459b6ae0-kube-api-access-8mxwr\") pod \"watcher3a3b-account-delete-889pf\" (UID: \"b4fe1652-89cd-474a-9b27-3011459b6ae0\") " pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf"
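[Note] "Killing container with a grace period ... gracePeriod=30" above is the first half of graceful pod termination: the runtime delivers SIGTERM, and only a container that outlives the grace period gets SIGKILL. Further down, watcher-kuttl-api-log finishes with exitCode=143, the conventional 128 + SIGTERM(15) encoding, i.e. it died on the signal; the containers that report exitCode=0 shut down cleanly on their own. The arithmetic, for reference:

```go
package main

import (
	"fmt"
	"syscall"
)

func main() {
	// A process terminated by a signal conventionally exits with 128+signo.
	fmt.Println(128 + int(syscall.SIGTERM)) // 143, the exitCode logged for watcher-kuttl-api-log
	fmt.Println(128 + int(syscall.SIGKILL)) // 137, what a missed grace period would produce
}
```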
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fe1652-89cd-474a-9b27-3011459b6ae0-operator-scripts\") pod \"watcher3a3b-account-delete-889pf\" (UID: \"b4fe1652-89cd-474a-9b27-3011459b6ae0\") " pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf" Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.695432 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fe1652-89cd-474a-9b27-3011459b6ae0-operator-scripts\") pod \"watcher3a3b-account-delete-889pf\" (UID: \"b4fe1652-89cd-474a-9b27-3011459b6ae0\") " pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf" Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.717196 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mxwr\" (UniqueName: \"kubernetes.io/projected/b4fe1652-89cd-474a-9b27-3011459b6ae0-kube-api-access-8mxwr\") pod \"watcher3a3b-account-delete-889pf\" (UID: \"b4fe1652-89cd-474a-9b27-3011459b6ae0\") " pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf" Dec 05 16:35:57 crc kubenswrapper[4778]: I1205 16:35:57.845556 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.011168 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.111239 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-config-data\") pod \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.111299 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83b7df7-adc4-4c29-8805-a7c99abc7f29-logs\") pod \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.111330 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-custom-prometheus-ca\") pod \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.111395 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52bb4\" (UniqueName: \"kubernetes.io/projected/f83b7df7-adc4-4c29-8805-a7c99abc7f29-kube-api-access-52bb4\") pod \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.111495 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-combined-ca-bundle\") pod \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\" (UID: \"f83b7df7-adc4-4c29-8805-a7c99abc7f29\") " Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.115828 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f83b7df7-adc4-4c29-8805-a7c99abc7f29-logs" (OuterVolumeSpecName: "logs") pod 
"f83b7df7-adc4-4c29-8805-a7c99abc7f29" (UID: "f83b7df7-adc4-4c29-8805-a7c99abc7f29"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.126348 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83b7df7-adc4-4c29-8805-a7c99abc7f29-kube-api-access-52bb4" (OuterVolumeSpecName: "kube-api-access-52bb4") pod "f83b7df7-adc4-4c29-8805-a7c99abc7f29" (UID: "f83b7df7-adc4-4c29-8805-a7c99abc7f29"). InnerVolumeSpecName "kube-api-access-52bb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.139736 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "f83b7df7-adc4-4c29-8805-a7c99abc7f29" (UID: "f83b7df7-adc4-4c29-8805-a7c99abc7f29"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.153191 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f83b7df7-adc4-4c29-8805-a7c99abc7f29" (UID: "f83b7df7-adc4-4c29-8805-a7c99abc7f29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.178771 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-config-data" (OuterVolumeSpecName: "config-data") pod "f83b7df7-adc4-4c29-8805-a7c99abc7f29" (UID: "f83b7df7-adc4-4c29-8805-a7c99abc7f29"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.214357 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.214676 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83b7df7-adc4-4c29-8805-a7c99abc7f29-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.214688 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.214699 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52bb4\" (UniqueName: \"kubernetes.io/projected/f83b7df7-adc4-4c29-8805-a7c99abc7f29-kube-api-access-52bb4\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.214708 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83b7df7-adc4-4c29-8805-a7c99abc7f29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.451008 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher3a3b-account-delete-889pf"] Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.609383 4778 generic.go:334] "Generic (PLEG): container finished" podID="fae1f369-b35f-440d-9c9a-1319dd8a1dcd" containerID="6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b" exitCode=143 Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.609469 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"fae1f369-b35f-440d-9c9a-1319dd8a1dcd","Type":"ContainerDied","Data":"6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b"} Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.610903 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf" event={"ID":"b4fe1652-89cd-474a-9b27-3011459b6ae0","Type":"ContainerStarted","Data":"af79d8e78be651822ce5095878935fd8c0685a54d25823ee61b559b1832ef62b"} Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.610934 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf" event={"ID":"b4fe1652-89cd-474a-9b27-3011459b6ae0","Type":"ContainerStarted","Data":"8d6965638d004d6bef37b6dd2ae24d8b2f3045c50430ad7e391bc9bd5b8cc7b7"} Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.614499 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f83b7df7-adc4-4c29-8805-a7c99abc7f29","Type":"ContainerDied","Data":"3fa410e0bc6579e5f0d5b2a995f81cc7524ad4e69d70b60acbd4d1bfc87ede99"} Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.614548 4778 scope.go:117] "RemoveContainer" containerID="9274237b4195a3f5d60ebe7acdddc72848811f006aef9fff1ec45d57b3c1a22b" Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.614680 4778 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.614680 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.634354 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf" podStartSLOduration=1.634333603 podStartE2EDuration="1.634333603s" podCreationTimestamp="2025-12-05 16:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:35:58.628161909 +0000 UTC m=+2445.731958299" watchObservedRunningTime="2025-12-05 16:35:58.634333603 +0000 UTC m=+2445.738129973"
Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.672337 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.681499 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 16:35:58 crc kubenswrapper[4778]: I1205 16:35:58.986222 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.025280 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-public-tls-certs\") pod \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") "
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.025331 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-internal-tls-certs\") pod \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") "
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.025352 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-combined-ca-bundle\") pod \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") "
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.025385 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-custom-prometheus-ca\") pod \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") "
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.025441 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2f4c\" (UniqueName: \"kubernetes.io/projected/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-kube-api-access-z2f4c\") pod \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") "
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.025470 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-config-data\") pod \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") "
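[Note] The "Observed pod startup duration" entry above is the kubelet's pod-start SLO bookkeeping: podStartSLOduration excludes image-pull time, and since both pull timestamps here are the zero time (no pull happened), it equals podStartE2EDuration. The number is watchObservedRunningTime minus podCreationTimestamp; checking the arithmetic:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	created, _ := time.Parse(time.RFC3339, "2025-12-05T16:35:57Z")
	observed, _ := time.Parse(time.RFC3339Nano, "2025-12-05T16:35:58.634333603Z")
	fmt.Println(observed.Sub(created)) // 1.634333603s, the podStartE2EDuration logged above
}
```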
\"kubernetes.io/empty-dir/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-logs\") pod \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\" (UID: \"fae1f369-b35f-440d-9c9a-1319dd8a1dcd\") " Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.026157 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-logs" (OuterVolumeSpecName: "logs") pod "fae1f369-b35f-440d-9c9a-1319dd8a1dcd" (UID: "fae1f369-b35f-440d-9c9a-1319dd8a1dcd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.030813 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-kube-api-access-z2f4c" (OuterVolumeSpecName: "kube-api-access-z2f4c") pod "fae1f369-b35f-440d-9c9a-1319dd8a1dcd" (UID: "fae1f369-b35f-440d-9c9a-1319dd8a1dcd"). InnerVolumeSpecName "kube-api-access-z2f4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.054422 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "fae1f369-b35f-440d-9c9a-1319dd8a1dcd" (UID: "fae1f369-b35f-440d-9c9a-1319dd8a1dcd"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.070619 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fae1f369-b35f-440d-9c9a-1319dd8a1dcd" (UID: "fae1f369-b35f-440d-9c9a-1319dd8a1dcd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.071600 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fae1f369-b35f-440d-9c9a-1319dd8a1dcd" (UID: "fae1f369-b35f-440d-9c9a-1319dd8a1dcd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.073101 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fae1f369-b35f-440d-9c9a-1319dd8a1dcd" (UID: "fae1f369-b35f-440d-9c9a-1319dd8a1dcd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.081231 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-config-data" (OuterVolumeSpecName: "config-data") pod "fae1f369-b35f-440d-9c9a-1319dd8a1dcd" (UID: "fae1f369-b35f-440d-9c9a-1319dd8a1dcd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.126909 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2f4c\" (UniqueName: \"kubernetes.io/projected/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-kube-api-access-z2f4c\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.126946 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.126958 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.126968 4778 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.126979 4778 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.126988 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.127000 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fae1f369-b35f-440d-9c9a-1319dd8a1dcd-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.261542 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="879c7499-2da6-4974-a832-e969d58d34c3" path="/var/lib/kubelet/pods/879c7499-2da6-4974-a832-e969d58d34c3/volumes" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.262204 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" path="/var/lib/kubelet/pods/f83b7df7-adc4-4c29-8805-a7c99abc7f29/volumes" Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.632401 4778 generic.go:334] "Generic (PLEG): container finished" podID="fae1f369-b35f-440d-9c9a-1319dd8a1dcd" containerID="ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34" exitCode=0 Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.632559 4778 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.632559 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.632590 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"fae1f369-b35f-440d-9c9a-1319dd8a1dcd","Type":"ContainerDied","Data":"ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34"}
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.632962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"fae1f369-b35f-440d-9c9a-1319dd8a1dcd","Type":"ContainerDied","Data":"e59227d4457a6eb3e19f8dd2369c5e28106c7bfa38ddd07e5bb6469df58ecf8c"}
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.633006 4778 scope.go:117] "RemoveContainer" containerID="ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34"
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.653875 4778 generic.go:334] "Generic (PLEG): container finished" podID="b4fe1652-89cd-474a-9b27-3011459b6ae0" containerID="af79d8e78be651822ce5095878935fd8c0685a54d25823ee61b559b1832ef62b" exitCode=0
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.653975 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf" event={"ID":"b4fe1652-89cd-474a-9b27-3011459b6ae0","Type":"ContainerDied","Data":"af79d8e78be651822ce5095878935fd8c0685a54d25823ee61b559b1832ef62b"}
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.693157 4778 scope.go:117] "RemoveContainer" containerID="6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b"
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.707215 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.715071 4778 scope.go:117] "RemoveContainer" containerID="ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34"
Dec 05 16:35:59 crc kubenswrapper[4778]: E1205 16:35:59.715591 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34\": container with ID starting with ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34 not found: ID does not exist" containerID="ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34"
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.715632 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34"} err="failed to get container status \"ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34\": rpc error: code = NotFound desc = could not find container \"ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34\": container with ID starting with ce26ca7bdcf1d7ff9bbee000c080fb1636c4e8999b055307a0b69c7531790b34 not found: ID does not exist"
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.715653 4778 scope.go:117] "RemoveContainer" containerID="6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b"
Dec 05 16:35:59 crc kubenswrapper[4778]: E1205 16:35:59.716924 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b\": container with ID starting with 6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b not found: ID does not exist" containerID="6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b"
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.716989 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b"} err="failed to get container status \"6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b\": rpc error: code = NotFound desc = could not find container \"6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b\": container with ID starting with 6108f970946cd219990ce3d16f4e96067f78bf4f6221147de5515fe16d77fc1b not found: ID does not exist"
Dec 05 16:35:59 crc kubenswrapper[4778]: I1205 16:35:59.725478 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 16:36:00 crc kubenswrapper[4778]: I1205 16:36:00.673572 4778 generic.go:334] "Generic (PLEG): container finished" podID="5d6a2d36-d40e-4057-9fad-7995792e6351" containerID="3d1e99de35edfa2cdfc436bdd862bf05a87012ccc513d115875bd95886b85384" exitCode=0
Dec 05 16:36:00 crc kubenswrapper[4778]: I1205 16:36:00.673632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"5d6a2d36-d40e-4057-9fad-7995792e6351","Type":"ContainerDied","Data":"3d1e99de35edfa2cdfc436bdd862bf05a87012ccc513d115875bd95886b85384"}
Dec 05 16:36:00 crc kubenswrapper[4778]: I1205 16:36:00.795192 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:36:00 crc kubenswrapper[4778]: I1205 16:36:00.961868 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6a2d36-d40e-4057-9fad-7995792e6351-config-data\") pod \"5d6a2d36-d40e-4057-9fad-7995792e6351\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") "
Dec 05 16:36:00 crc kubenswrapper[4778]: I1205 16:36:00.961934 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6a2d36-d40e-4057-9fad-7995792e6351-combined-ca-bundle\") pod \"5d6a2d36-d40e-4057-9fad-7995792e6351\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") "
Dec 05 16:36:00 crc kubenswrapper[4778]: I1205 16:36:00.962059 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d6a2d36-d40e-4057-9fad-7995792e6351-logs\") pod \"5d6a2d36-d40e-4057-9fad-7995792e6351\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") "
Dec 05 16:36:00 crc kubenswrapper[4778]: I1205 16:36:00.962100 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5nll\" (UniqueName: \"kubernetes.io/projected/5d6a2d36-d40e-4057-9fad-7995792e6351-kube-api-access-w5nll\") pod \"5d6a2d36-d40e-4057-9fad-7995792e6351\" (UID: \"5d6a2d36-d40e-4057-9fad-7995792e6351\") "
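[Note] The RemoveContainer/NotFound pairs above are benign: by the time the kubelet retries deletion of ce26ca7b... and 6108f970..., CRI-O has already pruned them, and a NotFound response simply means the desired end state already holds. Cleanup paths conventionally treat that gRPC code as success, along the lines of this illustrative helper (gRPC status codes are what the CRI transport uses):

```go
package cleanup

import (
	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// ignoreNotFound treats NotFound from the container runtime as success:
// the container we were asked to delete is already gone.
func ignoreNotFound(err error) error {
	if status.Code(err) == codes.NotFound {
		return nil
	}
	return err
}
```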
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:36:00 crc kubenswrapper[4778]: I1205 16:36:00.967133 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d6a2d36-d40e-4057-9fad-7995792e6351-kube-api-access-w5nll" (OuterVolumeSpecName: "kube-api-access-w5nll") pod "5d6a2d36-d40e-4057-9fad-7995792e6351" (UID: "5d6a2d36-d40e-4057-9fad-7995792e6351"). InnerVolumeSpecName "kube-api-access-w5nll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:00 crc kubenswrapper[4778]: I1205 16:36:00.985493 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d6a2d36-d40e-4057-9fad-7995792e6351-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d6a2d36-d40e-4057-9fad-7995792e6351" (UID: "5d6a2d36-d40e-4057-9fad-7995792e6351"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.017841 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d6a2d36-d40e-4057-9fad-7995792e6351-config-data" (OuterVolumeSpecName: "config-data") pod "5d6a2d36-d40e-4057-9fad-7995792e6351" (UID: "5d6a2d36-d40e-4057-9fad-7995792e6351"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.028132 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.064151 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5nll\" (UniqueName: \"kubernetes.io/projected/5d6a2d36-d40e-4057-9fad-7995792e6351-kube-api-access-w5nll\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.064207 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6a2d36-d40e-4057-9fad-7995792e6351-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.064221 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6a2d36-d40e-4057-9fad-7995792e6351-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.064232 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d6a2d36-d40e-4057-9fad-7995792e6351-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.165684 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mxwr\" (UniqueName: \"kubernetes.io/projected/b4fe1652-89cd-474a-9b27-3011459b6ae0-kube-api-access-8mxwr\") pod \"b4fe1652-89cd-474a-9b27-3011459b6ae0\" (UID: \"b4fe1652-89cd-474a-9b27-3011459b6ae0\") " Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.165786 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fe1652-89cd-474a-9b27-3011459b6ae0-operator-scripts\") pod \"b4fe1652-89cd-474a-9b27-3011459b6ae0\" (UID: \"b4fe1652-89cd-474a-9b27-3011459b6ae0\") " Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.166229 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b4fe1652-89cd-474a-9b27-3011459b6ae0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4fe1652-89cd-474a-9b27-3011459b6ae0" (UID: "b4fe1652-89cd-474a-9b27-3011459b6ae0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.166469 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fe1652-89cd-474a-9b27-3011459b6ae0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.170523 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4fe1652-89cd-474a-9b27-3011459b6ae0-kube-api-access-8mxwr" (OuterVolumeSpecName: "kube-api-access-8mxwr") pod "b4fe1652-89cd-474a-9b27-3011459b6ae0" (UID: "b4fe1652-89cd-474a-9b27-3011459b6ae0"). InnerVolumeSpecName "kube-api-access-8mxwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.264532 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fae1f369-b35f-440d-9c9a-1319dd8a1dcd" path="/var/lib/kubelet/pods/fae1f369-b35f-440d-9c9a-1319dd8a1dcd/volumes" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.268590 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mxwr\" (UniqueName: \"kubernetes.io/projected/b4fe1652-89cd-474a-9b27-3011459b6ae0-kube-api-access-8mxwr\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.686334 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.686323 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"5d6a2d36-d40e-4057-9fad-7995792e6351","Type":"ContainerDied","Data":"adab532763a91922a2129cf543dfe224758b6c2386dbd98862b845389cb2ec3e"} Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.686541 4778 scope.go:117] "RemoveContainer" containerID="3d1e99de35edfa2cdfc436bdd862bf05a87012ccc513d115875bd95886b85384" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.688703 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf" event={"ID":"b4fe1652-89cd-474a-9b27-3011459b6ae0","Type":"ContainerDied","Data":"8d6965638d004d6bef37b6dd2ae24d8b2f3045c50430ad7e391bc9bd5b8cc7b7"} Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.688778 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d6965638d004d6bef37b6dd2ae24d8b2f3045c50430ad7e391bc9bd5b8cc7b7" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.688806 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher3a3b-account-delete-889pf" Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.718940 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:36:01 crc kubenswrapper[4778]: I1205 16:36:01.726101 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.537601 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-vvs4k"] Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.543375 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-vvs4k"] Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.568198 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher3a3b-account-delete-889pf"] Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.578439 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb"] Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.587611 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher3a3b-account-delete-889pf"] Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.595980 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-3a3b-account-create-update-srfdb"] Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.642796 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-k8k7x"] Dec 05 16:36:02 crc kubenswrapper[4778]: E1205 16:36:02.643263 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643286 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:36:02 crc kubenswrapper[4778]: E1205 16:36:02.643302 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae1f369-b35f-440d-9c9a-1319dd8a1dcd" containerName="watcher-api" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643310 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae1f369-b35f-440d-9c9a-1319dd8a1dcd" containerName="watcher-api" Dec 05 16:36:02 crc kubenswrapper[4778]: E1205 16:36:02.643327 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d6a2d36-d40e-4057-9fad-7995792e6351" containerName="watcher-applier" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643335 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d6a2d36-d40e-4057-9fad-7995792e6351" containerName="watcher-applier" Dec 05 16:36:02 crc kubenswrapper[4778]: E1205 16:36:02.643342 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643349 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:36:02 crc kubenswrapper[4778]: E1205 16:36:02.643377 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 
16:36:02.643383 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:36:02 crc kubenswrapper[4778]: E1205 16:36:02.643394 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643402 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:36:02 crc kubenswrapper[4778]: E1205 16:36:02.643414 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae1f369-b35f-440d-9c9a-1319dd8a1dcd" containerName="watcher-kuttl-api-log" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643420 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae1f369-b35f-440d-9c9a-1319dd8a1dcd" containerName="watcher-kuttl-api-log" Dec 05 16:36:02 crc kubenswrapper[4778]: E1205 16:36:02.643441 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4fe1652-89cd-474a-9b27-3011459b6ae0" containerName="mariadb-account-delete" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643448 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4fe1652-89cd-474a-9b27-3011459b6ae0" containerName="mariadb-account-delete" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643672 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d6a2d36-d40e-4057-9fad-7995792e6351" containerName="watcher-applier" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643691 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643700 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643709 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643717 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643726 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4fe1652-89cd-474a-9b27-3011459b6ae0" containerName="mariadb-account-delete" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643740 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae1f369-b35f-440d-9c9a-1319dd8a1dcd" containerName="watcher-api" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.643747 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae1f369-b35f-440d-9c9a-1319dd8a1dcd" containerName="watcher-kuttl-api-log" Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.650291 4778 util.go:30] "No sandbox for pod can be found. 
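[Note] The cpu_manager/state_mem/memory_manager burst above is admission-time garbage collection: before placing the newly added watcher-db-create-k8k7x pod, the resource managers sweep out per-container assignments belonging to pods that no longer exist (the just-deleted watcher pods and the account-delete job). The repeated watcher-decision-engine entries are one per restart attempt that left state behind. A sketch of the sweep under stand-in types, in the spirit of cpu_manager.go:410 and memory_manager.go:354:

```go
package resourcemanagers

import "fmt"

// CPUSet is a stand-in for the manager's real assignment type.
type CPUSet []int

// removeStaleState drops (podUID, container) assignments whose pod is no
// longer active, then admits the new pod against a clean state.
func removeStaleState(assignments map[string]map[string]CPUSet, active map[string]bool) {
	for podUID, containers := range assignments {
		if active[podUID] {
			continue
		}
		for name := range containers {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", podUID, name)
			delete(containers, name)
		}
		delete(assignments, podUID)
	}
}
```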
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.650291 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-k8k7x"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.666799 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-k8k7x"]
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.695126 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjdcl\" (UniqueName: \"kubernetes.io/projected/3e3002d1-ef2d-4ae6-b699-149a3b7456cd-kube-api-access-wjdcl\") pod \"watcher-db-create-k8k7x\" (UID: \"3e3002d1-ef2d-4ae6-b699-149a3b7456cd\") " pod="watcher-kuttl-default/watcher-db-create-k8k7x"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.695197 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e3002d1-ef2d-4ae6-b699-149a3b7456cd-operator-scripts\") pod \"watcher-db-create-k8k7x\" (UID: \"3e3002d1-ef2d-4ae6-b699-149a3b7456cd\") " pod="watcher-kuttl-default/watcher-db-create-k8k7x"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.745267 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"]
Dec 05 16:36:02 crc kubenswrapper[4778]: E1205 16:36:02.745892 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.745924 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine"
Dec 05 16:36:02 crc kubenswrapper[4778]: E1205 16:36:02.745939 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.745946 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.746204 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.746222 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83b7df7-adc4-4c29-8805-a7c99abc7f29" containerName="watcher-decision-engine"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.746927 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.750715 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.762527 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"]
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.797973 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjdcl\" (UniqueName: \"kubernetes.io/projected/3e3002d1-ef2d-4ae6-b699-149a3b7456cd-kube-api-access-wjdcl\") pod \"watcher-db-create-k8k7x\" (UID: \"3e3002d1-ef2d-4ae6-b699-149a3b7456cd\") " pod="watcher-kuttl-default/watcher-db-create-k8k7x"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.798049 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e3002d1-ef2d-4ae6-b699-149a3b7456cd-operator-scripts\") pod \"watcher-db-create-k8k7x\" (UID: \"3e3002d1-ef2d-4ae6-b699-149a3b7456cd\") " pod="watcher-kuttl-default/watcher-db-create-k8k7x"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.798857 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e3002d1-ef2d-4ae6-b699-149a3b7456cd-operator-scripts\") pod \"watcher-db-create-k8k7x\" (UID: \"3e3002d1-ef2d-4ae6-b699-149a3b7456cd\") " pod="watcher-kuttl-default/watcher-db-create-k8k7x"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.830222 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjdcl\" (UniqueName: \"kubernetes.io/projected/3e3002d1-ef2d-4ae6-b699-149a3b7456cd-kube-api-access-wjdcl\") pod \"watcher-db-create-k8k7x\" (UID: \"3e3002d1-ef2d-4ae6-b699-149a3b7456cd\") " pod="watcher-kuttl-default/watcher-db-create-k8k7x"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.900087 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztzcn\" (UniqueName: \"kubernetes.io/projected/44210ac8-050b-4b56-b2f5-3afe7deae253-kube-api-access-ztzcn\") pod \"watcher-e51a-account-create-update-7vwlq\" (UID: \"44210ac8-050b-4b56-b2f5-3afe7deae253\") " pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"
Dec 05 16:36:02 crc kubenswrapper[4778]: I1205 16:36:02.900147 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44210ac8-050b-4b56-b2f5-3afe7deae253-operator-scripts\") pod \"watcher-e51a-account-create-update-7vwlq\" (UID: \"44210ac8-050b-4b56-b2f5-3afe7deae253\") " pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"
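[Note] Each job pod here mounts exactly two volumes: an operator-scripts ConfigMap and an auto-injected kube-api-access-* projected volume, which is how current Kubernetes delivers the service-account token (token + kube-root-ca.crt + namespace file). A sketch of what that projected source looks like in API terms; the expiration value is the usual injected default, assumed here rather than read from this cluster:

```go
package projectedvolume

import corev1 "k8s.io/api/core/v1"

// kubeAPIAccessSource approximates the auto-generated kube-api-access-*
// volume seen above (e.g. kube-api-access-wjdcl). ExpirationSeconds is the
// conventional default, an assumption for illustration.
func kubeAPIAccessSource() corev1.VolumeSource {
	expiry := int64(3607)
	return corev1.VolumeSource{
		Projected: &corev1.ProjectedVolumeSource{
			Sources: []corev1.VolumeProjection{
				{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
					Path: "token", ExpirationSeconds: &expiry,
				}},
				{ConfigMap: &corev1.ConfigMapProjection{
					LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
					Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
				}},
				{DownwardAPI: &corev1.DownwardAPIProjection{Items: []corev1.DownwardAPIVolumeFile{{
					Path:     "namespace",
					FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
				}}}},
			},
		},
	}
}
```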
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.002167 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztzcn\" (UniqueName: \"kubernetes.io/projected/44210ac8-050b-4b56-b2f5-3afe7deae253-kube-api-access-ztzcn\") pod \"watcher-e51a-account-create-update-7vwlq\" (UID: \"44210ac8-050b-4b56-b2f5-3afe7deae253\") " pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.002230 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44210ac8-050b-4b56-b2f5-3afe7deae253-operator-scripts\") pod \"watcher-e51a-account-create-update-7vwlq\" (UID: \"44210ac8-050b-4b56-b2f5-3afe7deae253\") " pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.003006 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44210ac8-050b-4b56-b2f5-3afe7deae253-operator-scripts\") pod \"watcher-e51a-account-create-update-7vwlq\" (UID: \"44210ac8-050b-4b56-b2f5-3afe7deae253\") " pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.030171 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztzcn\" (UniqueName: \"kubernetes.io/projected/44210ac8-050b-4b56-b2f5-3afe7deae253-kube-api-access-ztzcn\") pod \"watcher-e51a-account-create-update-7vwlq\" (UID: \"44210ac8-050b-4b56-b2f5-3afe7deae253\") " pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.098677 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.261847 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eaec35c-570b-4421-86b9-250313dfe459" path="/var/lib/kubelet/pods/4eaec35c-570b-4421-86b9-250313dfe459/volumes"
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.262389 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d6a2d36-d40e-4057-9fad-7995792e6351" path="/var/lib/kubelet/pods/5d6a2d36-d40e-4057-9fad-7995792e6351/volumes"
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.262920 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4fe1652-89cd-474a-9b27-3011459b6ae0" path="/var/lib/kubelet/pods/b4fe1652-89cd-474a-9b27-3011459b6ae0/volumes"
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.264176 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec32a4a4-dcba-4160-8acf-f9910c62c6b1" path="/var/lib/kubelet/pods/ec32a4a4-dcba-4160-8acf-f9910c62c6b1/volumes"
Dec 05 16:36:03 crc kubenswrapper[4778]: W1205 16:36:03.427234 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e3002d1_ef2d_4ae6_b699_149a3b7456cd.slice/crio-e026a36f8e94e375977ea92cf485d53cb4034965541b440963a9ec3f34c17d1b WatchSource:0}: Error finding container e026a36f8e94e375977ea92cf485d53cb4034965541b440963a9ec3f34c17d1b: Status 404 returned error can't find the container with id e026a36f8e94e375977ea92cf485d53cb4034965541b440963a9ec3f34c17d1b
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.429174 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-k8k7x"]
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.566136 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"]
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.708304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-k8k7x" event={"ID":"3e3002d1-ef2d-4ae6-b699-149a3b7456cd","Type":"ContainerStarted","Data":"af4d05c66fc16e524d095921d065b0c4c760e80b9b9e35e42987db9642e6dc63"}
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.708877 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-k8k7x" event={"ID":"3e3002d1-ef2d-4ae6-b699-149a3b7456cd","Type":"ContainerStarted","Data":"e026a36f8e94e375977ea92cf485d53cb4034965541b440963a9ec3f34c17d1b"}
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.710644 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq" event={"ID":"44210ac8-050b-4b56-b2f5-3afe7deae253","Type":"ContainerStarted","Data":"8890a4b8e147fdf6b0bae667cd81333d065f5c80bc594666954dd5c40141c5a0"}
Dec 05 16:36:03 crc kubenswrapper[4778]: I1205 16:36:03.729809 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db-create-k8k7x" podStartSLOduration=1.7297907719999999 podStartE2EDuration="1.729790772s" podCreationTimestamp="2025-12-05 16:36:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:36:03.724471784 +0000 UTC m=+2450.828268164" watchObservedRunningTime="2025-12-05 16:36:03.729790772 +0000 UTC m=+2450.833587152"
Dec 05 16:36:04 crc kubenswrapper[4778]: I1205 16:36:04.721923 4778 generic.go:334] "Generic (PLEG): container finished" podID="3e3002d1-ef2d-4ae6-b699-149a3b7456cd" containerID="af4d05c66fc16e524d095921d065b0c4c760e80b9b9e35e42987db9642e6dc63" exitCode=0
Dec 05 16:36:04 crc kubenswrapper[4778]: I1205 16:36:04.722022 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-k8k7x" event={"ID":"3e3002d1-ef2d-4ae6-b699-149a3b7456cd","Type":"ContainerDied","Data":"af4d05c66fc16e524d095921d065b0c4c760e80b9b9e35e42987db9642e6dc63"}
Dec 05 16:36:04 crc kubenswrapper[4778]: I1205 16:36:04.725179 4778 generic.go:334] "Generic (PLEG): container finished" podID="44210ac8-050b-4b56-b2f5-3afe7deae253" containerID="8c02c7fb994d4e7c1a3cbe51a221bb427053b507384aa8cc52722451bdf97b18" exitCode=0
Dec 05 16:36:04 crc kubenswrapper[4778]: I1205 16:36:04.725234 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq" event={"ID":"44210ac8-050b-4b56-b2f5-3afe7deae253","Type":"ContainerDied","Data":"8c02c7fb994d4e7c1a3cbe51a221bb427053b507384aa8cc52722451bdf97b18"}
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.208403 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-k8k7x"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.214756 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.328271 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fj769"]
Dec 05 16:36:06 crc kubenswrapper[4778]: E1205 16:36:06.328699 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44210ac8-050b-4b56-b2f5-3afe7deae253" containerName="mariadb-account-create-update"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.328722 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="44210ac8-050b-4b56-b2f5-3afe7deae253" containerName="mariadb-account-create-update"
Dec 05 16:36:06 crc kubenswrapper[4778]: E1205 16:36:06.328750 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3002d1-ef2d-4ae6-b699-149a3b7456cd" containerName="mariadb-database-create"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.328757 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3002d1-ef2d-4ae6-b699-149a3b7456cd" containerName="mariadb-database-create"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.328901 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3002d1-ef2d-4ae6-b699-149a3b7456cd" containerName="mariadb-database-create"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.328922 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="44210ac8-050b-4b56-b2f5-3afe7deae253" containerName="mariadb-account-create-update"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.330024 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj769"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.335690 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj769"]
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.369215 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztzcn\" (UniqueName: \"kubernetes.io/projected/44210ac8-050b-4b56-b2f5-3afe7deae253-kube-api-access-ztzcn\") pod \"44210ac8-050b-4b56-b2f5-3afe7deae253\" (UID: \"44210ac8-050b-4b56-b2f5-3afe7deae253\") "
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.369265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e3002d1-ef2d-4ae6-b699-149a3b7456cd-operator-scripts\") pod \"3e3002d1-ef2d-4ae6-b699-149a3b7456cd\" (UID: \"3e3002d1-ef2d-4ae6-b699-149a3b7456cd\") "
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.369391 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44210ac8-050b-4b56-b2f5-3afe7deae253-operator-scripts\") pod \"44210ac8-050b-4b56-b2f5-3afe7deae253\" (UID: \"44210ac8-050b-4b56-b2f5-3afe7deae253\") "
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.369452 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjdcl\" (UniqueName: \"kubernetes.io/projected/3e3002d1-ef2d-4ae6-b699-149a3b7456cd-kube-api-access-wjdcl\") pod \"3e3002d1-ef2d-4ae6-b699-149a3b7456cd\" (UID: \"3e3002d1-ef2d-4ae6-b699-149a3b7456cd\") "
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.373749 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3002d1-ef2d-4ae6-b699-149a3b7456cd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e3002d1-ef2d-4ae6-b699-149a3b7456cd" (UID: "3e3002d1-ef2d-4ae6-b699-149a3b7456cd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.376151 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44210ac8-050b-4b56-b2f5-3afe7deae253-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44210ac8-050b-4b56-b2f5-3afe7deae253" (UID: "44210ac8-050b-4b56-b2f5-3afe7deae253"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.380382 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3002d1-ef2d-4ae6-b699-149a3b7456cd-kube-api-access-wjdcl" (OuterVolumeSpecName: "kube-api-access-wjdcl") pod "3e3002d1-ef2d-4ae6-b699-149a3b7456cd" (UID: "3e3002d1-ef2d-4ae6-b699-149a3b7456cd"). InnerVolumeSpecName "kube-api-access-wjdcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.400714 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44210ac8-050b-4b56-b2f5-3afe7deae253-kube-api-access-ztzcn" (OuterVolumeSpecName: "kube-api-access-ztzcn") pod "44210ac8-050b-4b56-b2f5-3afe7deae253" (UID: "44210ac8-050b-4b56-b2f5-3afe7deae253"). InnerVolumeSpecName "kube-api-access-ztzcn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.471565 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-catalog-content\") pod \"redhat-marketplace-fj769\" (UID: \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\") " pod="openshift-marketplace/redhat-marketplace-fj769"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.471785 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dch2g\" (UniqueName: \"kubernetes.io/projected/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-kube-api-access-dch2g\") pod \"redhat-marketplace-fj769\" (UID: \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\") " pod="openshift-marketplace/redhat-marketplace-fj769"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.471853 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-utilities\") pod \"redhat-marketplace-fj769\" (UID: \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\") " pod="openshift-marketplace/redhat-marketplace-fj769"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.471991 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjdcl\" (UniqueName: \"kubernetes.io/projected/3e3002d1-ef2d-4ae6-b699-149a3b7456cd-kube-api-access-wjdcl\") on node \"crc\" DevicePath \"\""
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.472009 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztzcn\" (UniqueName: \"kubernetes.io/projected/44210ac8-050b-4b56-b2f5-3afe7deae253-kube-api-access-ztzcn\") on node \"crc\" DevicePath \"\""
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.472019 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e3002d1-ef2d-4ae6-b699-149a3b7456cd-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.472029 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44210ac8-050b-4b56-b2f5-3afe7deae253-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.573345 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-catalog-content\") pod \"redhat-marketplace-fj769\" (UID: \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\") " pod="openshift-marketplace/redhat-marketplace-fj769"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.573467 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dch2g\" (UniqueName: \"kubernetes.io/projected/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-kube-api-access-dch2g\") pod \"redhat-marketplace-fj769\" (UID: \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\") " pod="openshift-marketplace/redhat-marketplace-fj769"
Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.573493 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-utilities\") pod \"redhat-marketplace-fj769\" (UID: \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\") " pod="openshift-marketplace/redhat-marketplace-fj769"
pod="openshift-marketplace/redhat-marketplace-fj769" Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.573864 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-catalog-content\") pod \"redhat-marketplace-fj769\" (UID: \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\") " pod="openshift-marketplace/redhat-marketplace-fj769" Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.573952 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-utilities\") pod \"redhat-marketplace-fj769\" (UID: \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\") " pod="openshift-marketplace/redhat-marketplace-fj769" Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.590257 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dch2g\" (UniqueName: \"kubernetes.io/projected/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-kube-api-access-dch2g\") pod \"redhat-marketplace-fj769\" (UID: \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\") " pod="openshift-marketplace/redhat-marketplace-fj769" Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.644393 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj769" Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.750141 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq" Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.751514 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq" event={"ID":"44210ac8-050b-4b56-b2f5-3afe7deae253","Type":"ContainerDied","Data":"8890a4b8e147fdf6b0bae667cd81333d065f5c80bc594666954dd5c40141c5a0"} Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.751570 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8890a4b8e147fdf6b0bae667cd81333d065f5c80bc594666954dd5c40141c5a0" Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.767315 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-k8k7x" event={"ID":"3e3002d1-ef2d-4ae6-b699-149a3b7456cd","Type":"ContainerDied","Data":"e026a36f8e94e375977ea92cf485d53cb4034965541b440963a9ec3f34c17d1b"} Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.767384 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e026a36f8e94e375977ea92cf485d53cb4034965541b440963a9ec3f34c17d1b" Dec 05 16:36:06 crc kubenswrapper[4778]: I1205 16:36:06.767457 4778 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 16:36:07 crc kubenswrapper[4778]: I1205 16:36:07.119429 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj769"]
Dec 05 16:36:07 crc kubenswrapper[4778]: W1205 16:36:07.122465 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83b5e71a_c4e5_4da7_bd2c_d06ef73d657c.slice/crio-dd673f3495d4097020ef5d7e63487695cfe5a30f045c8c2c653d16c6a971c4aa WatchSource:0}: Error finding container dd673f3495d4097020ef5d7e63487695cfe5a30f045c8c2c653d16c6a971c4aa: Status 404 returned error can't find the container with id dd673f3495d4097020ef5d7e63487695cfe5a30f045c8c2c653d16c6a971c4aa
Dec 05 16:36:07 crc kubenswrapper[4778]: I1205 16:36:07.249769 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c"
Dec 05 16:36:07 crc kubenswrapper[4778]: E1205 16:36:07.250046 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e"
Dec 05 16:36:07 crc kubenswrapper[4778]: I1205 16:36:07.777556 4778 generic.go:334] "Generic (PLEG): container finished" podID="83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" containerID="1fc241b7718584a46d6661a39bd29ca4f09e943372767b2a0cbadc8ba29c2e2f" exitCode=0
Dec 05 16:36:07 crc kubenswrapper[4778]: I1205 16:36:07.777739 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj769" event={"ID":"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c","Type":"ContainerDied","Data":"1fc241b7718584a46d6661a39bd29ca4f09e943372767b2a0cbadc8ba29c2e2f"}
Dec 05 16:36:07 crc kubenswrapper[4778]: I1205 16:36:07.778034 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj769" event={"ID":"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c","Type":"ContainerStarted","Data":"dd673f3495d4097020ef5d7e63487695cfe5a30f045c8c2c653d16c6a971c4aa"}
Dec 05 16:36:07 crc kubenswrapper[4778]: I1205 16:36:07.780884 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.000737 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"]
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.002161 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.006259 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.006814 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-2vg56"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.013380 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"]
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.196968 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-zj9vf\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.197030 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-db-sync-config-data\") pod \"watcher-kuttl-db-sync-zj9vf\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.197135 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwkqg\" (UniqueName: \"kubernetes.io/projected/2588746d-a6c5-4c07-b515-7ea2429723cb-kube-api-access-nwkqg\") pod \"watcher-kuttl-db-sync-zj9vf\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.197250 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-config-data\") pod \"watcher-kuttl-db-sync-zj9vf\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.301091 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwkqg\" (UniqueName: \"kubernetes.io/projected/2588746d-a6c5-4c07-b515-7ea2429723cb-kube-api-access-nwkqg\") pod \"watcher-kuttl-db-sync-zj9vf\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.302229 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-config-data\") pod \"watcher-kuttl-db-sync-zj9vf\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.302762 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-zj9vf\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.302832 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-db-sync-config-data\") pod \"watcher-kuttl-db-sync-zj9vf\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.310477 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-db-sync-config-data\") pod \"watcher-kuttl-db-sync-zj9vf\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.311180 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-config-data\") pod \"watcher-kuttl-db-sync-zj9vf\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.320449 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-zj9vf\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.327206 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwkqg\" (UniqueName: \"kubernetes.io/projected/2588746d-a6c5-4c07-b515-7ea2429723cb-kube-api-access-nwkqg\") pod \"watcher-kuttl-db-sync-zj9vf\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:08 crc kubenswrapper[4778]: I1205 16:36:08.617764 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"
Dec 05 16:36:09 crc kubenswrapper[4778]: I1205 16:36:09.105500 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"]
Dec 05 16:36:09 crc kubenswrapper[4778]: I1205 16:36:09.796563 4778 generic.go:334] "Generic (PLEG): container finished" podID="83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" containerID="343138e2eb4ae3926491532e53293f0ab41237a5fbd9065af12a59848488ad59" exitCode=0
Dec 05 16:36:09 crc kubenswrapper[4778]: I1205 16:36:09.796646 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj769" event={"ID":"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c","Type":"ContainerDied","Data":"343138e2eb4ae3926491532e53293f0ab41237a5fbd9065af12a59848488ad59"}
Dec 05 16:36:09 crc kubenswrapper[4778]: I1205 16:36:09.799197 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf" event={"ID":"2588746d-a6c5-4c07-b515-7ea2429723cb","Type":"ContainerStarted","Data":"750a95dfb36f3ecd21cb0dfc57217e9d88cad3ab11c7ad900fabde8499f47b2b"}
Dec 05 16:36:09 crc kubenswrapper[4778]: I1205 16:36:09.799257 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf" event={"ID":"2588746d-a6c5-4c07-b515-7ea2429723cb","Type":"ContainerStarted","Data":"1a0e535a28cc60c71f8246fa3409ac764fb1b18577285f2e931210765078aa9d"}
Dec 05 16:36:09 crc kubenswrapper[4778]: I1205 16:36:09.844899 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf" podStartSLOduration=2.844881964 podStartE2EDuration="2.844881964s" podCreationTimestamp="2025-12-05 16:36:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:36:09.839147913 +0000 UTC m=+2456.942944333" watchObservedRunningTime="2025-12-05 16:36:09.844881964 +0000 UTC m=+2456.948678344"
Dec 05 16:36:10 crc kubenswrapper[4778]: I1205 16:36:10.809690 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj769" event={"ID":"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c","Type":"ContainerStarted","Data":"aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9"}
Dec 05 16:36:10 crc kubenswrapper[4778]: I1205 16:36:10.837648 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fj769" podStartSLOduration=2.4238440150000002 podStartE2EDuration="4.837630175s" podCreationTimestamp="2025-12-05 16:36:06 +0000 UTC" firstStartedPulling="2025-12-05 16:36:07.78045216 +0000 UTC m=+2454.884248540" lastFinishedPulling="2025-12-05 16:36:10.19423829 +0000 UTC m=+2457.298034700" observedRunningTime="2025-12-05 16:36:10.832344788 +0000 UTC m=+2457.936141178" watchObservedRunningTime="2025-12-05 16:36:10.837630175 +0000 UTC m=+2457.941426565"
Dec 05 16:36:11 crc kubenswrapper[4778]: I1205 16:36:11.818398 4778 generic.go:334] "Generic (PLEG): container finished" podID="2588746d-a6c5-4c07-b515-7ea2429723cb" containerID="750a95dfb36f3ecd21cb0dfc57217e9d88cad3ab11c7ad900fabde8499f47b2b" exitCode=0
Dec 05 16:36:11 crc kubenswrapper[4778]: I1205 16:36:11.819442 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf" event={"ID":"2588746d-a6c5-4c07-b515-7ea2429723cb","Type":"ContainerDied","Data":"750a95dfb36f3ecd21cb0dfc57217e9d88cad3ab11c7ad900fabde8499f47b2b"}
event={"ID":"2588746d-a6c5-4c07-b515-7ea2429723cb","Type":"ContainerDied","Data":"750a95dfb36f3ecd21cb0dfc57217e9d88cad3ab11c7ad900fabde8499f47b2b"} Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.126045 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf" Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.221534 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-config-data\") pod \"2588746d-a6c5-4c07-b515-7ea2429723cb\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.221632 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-db-sync-config-data\") pod \"2588746d-a6c5-4c07-b515-7ea2429723cb\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.221677 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-combined-ca-bundle\") pod \"2588746d-a6c5-4c07-b515-7ea2429723cb\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.221724 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwkqg\" (UniqueName: \"kubernetes.io/projected/2588746d-a6c5-4c07-b515-7ea2429723cb-kube-api-access-nwkqg\") pod \"2588746d-a6c5-4c07-b515-7ea2429723cb\" (UID: \"2588746d-a6c5-4c07-b515-7ea2429723cb\") " Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.226578 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2588746d-a6c5-4c07-b515-7ea2429723cb" (UID: "2588746d-a6c5-4c07-b515-7ea2429723cb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.227063 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2588746d-a6c5-4c07-b515-7ea2429723cb-kube-api-access-nwkqg" (OuterVolumeSpecName: "kube-api-access-nwkqg") pod "2588746d-a6c5-4c07-b515-7ea2429723cb" (UID: "2588746d-a6c5-4c07-b515-7ea2429723cb"). InnerVolumeSpecName "kube-api-access-nwkqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.245586 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2588746d-a6c5-4c07-b515-7ea2429723cb" (UID: "2588746d-a6c5-4c07-b515-7ea2429723cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.270692 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-config-data" (OuterVolumeSpecName: "config-data") pod "2588746d-a6c5-4c07-b515-7ea2429723cb" (UID: "2588746d-a6c5-4c07-b515-7ea2429723cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.327832 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwkqg\" (UniqueName: \"kubernetes.io/projected/2588746d-a6c5-4c07-b515-7ea2429723cb-kube-api-access-nwkqg\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.327870 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.327880 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.327896 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2588746d-a6c5-4c07-b515-7ea2429723cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.838189 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf" event={"ID":"2588746d-a6c5-4c07-b515-7ea2429723cb","Type":"ContainerDied","Data":"1a0e535a28cc60c71f8246fa3409ac764fb1b18577285f2e931210765078aa9d"} Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.838249 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a0e535a28cc60c71f8246fa3409ac764fb1b18577285f2e931210765078aa9d" Dec 05 16:36:13 crc kubenswrapper[4778]: I1205 16:36:13.838671 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.153028 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:36:14 crc kubenswrapper[4778]: E1205 16:36:14.153894 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2588746d-a6c5-4c07-b515-7ea2429723cb" containerName="watcher-kuttl-db-sync" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.153914 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2588746d-a6c5-4c07-b515-7ea2429723cb" containerName="watcher-kuttl-db-sync" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.154129 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2588746d-a6c5-4c07-b515-7ea2429723cb" containerName="watcher-kuttl-db-sync" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.155477 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.158180 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.162562 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.163091 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-2vg56" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.164221 4778 util.go:30] "No sandbox for pod can be found. 
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.187070 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.197648 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"]
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.234463 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.235789 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.242347 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.242845 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243565 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243636 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243664 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61de5783-faa3-466b-8121-69d6c8dcb01b-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243691 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-logs\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243710 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243752 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243777 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243802 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7jv\" (UniqueName: \"kubernetes.io/projected/61de5783-faa3-466b-8121-69d6c8dcb01b-kube-api-access-gn7jv\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243850 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243871 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d620ac9-3637-4725-8ba8-bd2573ecd345-logs\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243902 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243921 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243942 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k454g\" (UniqueName: \"kubernetes.io/projected/0d620ac9-3637-4725-8ba8-bd2573ecd345-kube-api-access-k454g\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.243969 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4xfv\" (UniqueName: \"kubernetes.io/projected/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-kube-api-access-z4xfv\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.313424 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.314498 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.320721 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.342395 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346132 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346195 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346214 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346235 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn7jv\" (UniqueName: \"kubernetes.io/projected/61de5783-faa3-466b-8121-69d6c8dcb01b-kube-api-access-gn7jv\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346279 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346299 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d620ac9-3637-4725-8ba8-bd2573ecd345-logs\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346325 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346343 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346377 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k454g\" (UniqueName: \"kubernetes.io/projected/0d620ac9-3637-4725-8ba8-bd2573ecd345-kube-api-access-k454g\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346401 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4xfv\" (UniqueName: \"kubernetes.io/projected/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-kube-api-access-z4xfv\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346421 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346442 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346458 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61de5783-faa3-466b-8121-69d6c8dcb01b-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346476 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-logs\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.346920 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-logs\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.358530 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d620ac9-3637-4725-8ba8-bd2573ecd345-logs\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.360868 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61de5783-faa3-466b-8121-69d6c8dcb01b-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.370683 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
\"kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.371992 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.372466 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.372849 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.375956 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.376195 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.395431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.400247 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.400554 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4xfv\" (UniqueName: \"kubernetes.io/projected/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-kube-api-access-z4xfv\") pod \"watcher-kuttl-api-1\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.400937 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k454g\" (UniqueName: \"kubernetes.io/projected/0d620ac9-3637-4725-8ba8-bd2573ecd345-kube-api-access-k454g\") pod \"watcher-kuttl-api-0\" (UID: 
\"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.407949 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn7jv\" (UniqueName: \"kubernetes.io/projected/61de5783-faa3-466b-8121-69d6c8dcb01b-kube-api-access-gn7jv\") pod \"watcher-kuttl-applier-0\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.454251 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.454360 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfg9t\" (UniqueName: \"kubernetes.io/projected/233d3868-b9e1-4500-8f58-101a60f83778-kube-api-access-xfg9t\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.454417 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.454436 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233d3868-b9e1-4500-8f58-101a60f83778-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.454479 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.477813 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.483791 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.558212 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.558261 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233d3868-b9e1-4500-8f58-101a60f83778-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.558304 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.558399 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.558436 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfg9t\" (UniqueName: \"kubernetes.io/projected/233d3868-b9e1-4500-8f58-101a60f83778-kube-api-access-xfg9t\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.561772 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233d3868-b9e1-4500-8f58-101a60f83778-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.562145 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.568185 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.573574 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.576038 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.592293 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfg9t\" (UniqueName: \"kubernetes.io/projected/233d3868-b9e1-4500-8f58-101a60f83778-kube-api-access-xfg9t\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:14 crc kubenswrapper[4778]: I1205 16:36:14.632863 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:15 crc kubenswrapper[4778]: I1205 16:36:15.077771 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:36:15 crc kubenswrapper[4778]: W1205 16:36:15.089349 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d620ac9_3637_4725_8ba8_bd2573ecd345.slice/crio-0b4ab3b52ba86aa45397b582429a9cab312e24abf6f1cf4772a0290bd32de442 WatchSource:0}: Error finding container 0b4ab3b52ba86aa45397b582429a9cab312e24abf6f1cf4772a0290bd32de442: Status 404 returned error can't find the container with id 0b4ab3b52ba86aa45397b582429a9cab312e24abf6f1cf4772a0290bd32de442 Dec 05 16:36:15 crc kubenswrapper[4778]: I1205 16:36:15.146193 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 16:36:15 crc kubenswrapper[4778]: W1205 16:36:15.152846 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod070175cf_65a4_4bbb_a5c0_1c3fe31fb2fd.slice/crio-6ed5ae9c071462facac76dc86718bbe4f9e382f7e68bcb6f3907ab4ddd32fccb WatchSource:0}: Error finding container 6ed5ae9c071462facac76dc86718bbe4f9e382f7e68bcb6f3907ab4ddd32fccb: Status 404 returned error can't find the container with id 6ed5ae9c071462facac76dc86718bbe4f9e382f7e68bcb6f3907ab4ddd32fccb Dec 05 16:36:15 crc kubenswrapper[4778]: W1205 16:36:15.273574 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod233d3868_b9e1_4500_8f58_101a60f83778.slice/crio-8d75f24abdf0feb656e183c7b84ded8b54401955b98da6b7095809e76b9559c4 WatchSource:0}: Error finding container 8d75f24abdf0feb656e183c7b84ded8b54401955b98da6b7095809e76b9559c4: Status 404 returned error can't find the container with id 8d75f24abdf0feb656e183c7b84ded8b54401955b98da6b7095809e76b9559c4 Dec 05 16:36:15 crc kubenswrapper[4778]: I1205 16:36:15.275231 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:36:15 crc kubenswrapper[4778]: I1205 16:36:15.787619 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:36:15 crc kubenswrapper[4778]: W1205 16:36:15.799034 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61de5783_faa3_466b_8121_69d6c8dcb01b.slice/crio-59df40978a926405a7dfa5f506a9c0ce6e82d32a90e3a766ea05ef30f4f5938f WatchSource:0}: Error finding container 59df40978a926405a7dfa5f506a9c0ce6e82d32a90e3a766ea05ef30f4f5938f: Status 404 returned error can't find the container with id 59df40978a926405a7dfa5f506a9c0ce6e82d32a90e3a766ea05ef30f4f5938f Dec 05 16:36:15 crc kubenswrapper[4778]: I1205 16:36:15.859042 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerStarted","Data":"b548a1e38b1eab8a4703cec78875747636890351ce77d9863c3ba9ad610c2b06"} Dec 05 16:36:15 crc kubenswrapper[4778]: I1205 16:36:15.859085 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerStarted","Data":"8d75f24abdf0feb656e183c7b84ded8b54401955b98da6b7095809e76b9559c4"} Dec 05 16:36:15 crc kubenswrapper[4778]: I1205 16:36:15.863158 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd","Type":"ContainerStarted","Data":"093eea9ca1289144db233d06bdf0c20c91f488ca2e07beacc0855db06ecb65e1"} Dec 05 16:36:15 crc kubenswrapper[4778]: I1205 16:36:15.863186 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd","Type":"ContainerStarted","Data":"6ed5ae9c071462facac76dc86718bbe4f9e382f7e68bcb6f3907ab4ddd32fccb"} Dec 05 16:36:15 crc kubenswrapper[4778]: I1205 16:36:15.870687 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"61de5783-faa3-466b-8121-69d6c8dcb01b","Type":"ContainerStarted","Data":"59df40978a926405a7dfa5f506a9c0ce6e82d32a90e3a766ea05ef30f4f5938f"} Dec 05 16:36:15 crc kubenswrapper[4778]: I1205 16:36:15.872089 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"0d620ac9-3637-4725-8ba8-bd2573ecd345","Type":"ContainerStarted","Data":"ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e"} Dec 05 16:36:15 crc kubenswrapper[4778]: I1205 16:36:15.872115 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"0d620ac9-3637-4725-8ba8-bd2573ecd345","Type":"ContainerStarted","Data":"0b4ab3b52ba86aa45397b582429a9cab312e24abf6f1cf4772a0290bd32de442"} Dec 05 16:36:15 crc kubenswrapper[4778]: I1205 16:36:15.887116 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.887095404 podStartE2EDuration="1.887095404s" podCreationTimestamp="2025-12-05 16:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:36:15.880009444 +0000 UTC m=+2462.983805844" watchObservedRunningTime="2025-12-05 16:36:15.887095404 +0000 UTC m=+2462.990891784" Dec 05 16:36:16 crc kubenswrapper[4778]: I1205 16:36:16.645077 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fj769" Dec 05 16:36:16 crc kubenswrapper[4778]: I1205 16:36:16.645435 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fj769" Dec 05 16:36:16 crc kubenswrapper[4778]: I1205 16:36:16.696606 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fj769" Dec 05 16:36:16 crc kubenswrapper[4778]: I1205 16:36:16.880822 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"0d620ac9-3637-4725-8ba8-bd2573ecd345","Type":"ContainerStarted","Data":"59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40"} Dec 05 16:36:16 crc kubenswrapper[4778]: I1205 16:36:16.881272 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:36:16 crc kubenswrapper[4778]: I1205 16:36:16.884718 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd","Type":"ContainerStarted","Data":"1d46478fcd519b2f1645f0afe8864e337e6bc48b4e273bfc7928c0748ea9ea3f"} Dec 05 16:36:16 crc kubenswrapper[4778]: I1205 16:36:16.884966 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:36:16 crc kubenswrapper[4778]: I1205 16:36:16.888398 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"61de5783-faa3-466b-8121-69d6c8dcb01b","Type":"ContainerStarted","Data":"c946218ff5881070cb0218aceb13162a4c809c65d77f11a88834c149b5e58d12"} Dec 05 16:36:16 crc kubenswrapper[4778]: I1205 16:36:16.910135 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.910111537 podStartE2EDuration="2.910111537s" podCreationTimestamp="2025-12-05 16:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:36:16.900999726 +0000 UTC m=+2464.004796116" watchObservedRunningTime="2025-12-05 16:36:16.910111537 +0000 UTC m=+2464.013907927" Dec 05 16:36:16 crc kubenswrapper[4778]: I1205 16:36:16.937890 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-1" podStartSLOduration=2.937869704 podStartE2EDuration="2.937869704s" podCreationTimestamp="2025-12-05 16:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:36:16.925467655 +0000 UTC m=+2464.029264045" watchObservedRunningTime="2025-12-05 16:36:16.937869704 +0000 UTC m=+2464.041666094" Dec 05 16:36:16 crc kubenswrapper[4778]: I1205 16:36:16.954473 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fj769" Dec 05 16:36:16 crc kubenswrapper[4778]: I1205 16:36:16.955334 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.9553139330000002 podStartE2EDuration="2.955313933s" podCreationTimestamp="2025-12-05 16:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:36:16.948687706 +0000 UTC m=+2464.052484096" watchObservedRunningTime="2025-12-05 16:36:16.955313933 +0000 UTC m=+2464.059110333" Dec 05 16:36:18 crc kubenswrapper[4778]: I1205 16:36:18.909530 4778 generic.go:334] "Generic (PLEG): container finished" podID="233d3868-b9e1-4500-8f58-101a60f83778" containerID="b548a1e38b1eab8a4703cec78875747636890351ce77d9863c3ba9ad610c2b06" exitCode=1 Dec 05 16:36:18 crc kubenswrapper[4778]: I1205 16:36:18.909583 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerDied","Data":"b548a1e38b1eab8a4703cec78875747636890351ce77d9863c3ba9ad610c2b06"} Dec 05 16:36:18 crc kubenswrapper[4778]: I1205 16:36:18.910355 4778 scope.go:117] "RemoveContainer" containerID="b548a1e38b1eab8a4703cec78875747636890351ce77d9863c3ba9ad610c2b06" Dec 05 16:36:19 crc kubenswrapper[4778]: I1205 16:36:19.028420 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:36:19 crc kubenswrapper[4778]: I1205 16:36:19.251614 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:36:19 crc kubenswrapper[4778]: E1205 16:36:19.251910 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:36:19 crc kubenswrapper[4778]: I1205 16:36:19.371096 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:36:19 crc kubenswrapper[4778]: I1205 16:36:19.478521 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:36:19 crc kubenswrapper[4778]: I1205 16:36:19.485037 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:36:19 crc kubenswrapper[4778]: I1205 16:36:19.563168 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:36:19 crc kubenswrapper[4778]: I1205 16:36:19.919293 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerStarted","Data":"b6592f42fe79dcc3e52b2c8c1ae84ee4bad362a15a4239e63110eb0b55880bbf"} Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.317570 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj769"] Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.317899 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fj769" podUID="83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" containerName="registry-server" containerID="cri-o://aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9" gracePeriod=2 Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.836576 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj769" Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.897812 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-catalog-content\") pod \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\" (UID: \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\") " Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.898004 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dch2g\" (UniqueName: \"kubernetes.io/projected/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-kube-api-access-dch2g\") pod \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\" (UID: \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\") " Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.898071 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-utilities\") pod \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\" (UID: \"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c\") " Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.899175 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-utilities" (OuterVolumeSpecName: "utilities") pod "83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" (UID: "83b5e71a-c4e5-4da7-bd2c-d06ef73d657c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.904400 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-kube-api-access-dch2g" (OuterVolumeSpecName: "kube-api-access-dch2g") pod "83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" (UID: "83b5e71a-c4e5-4da7-bd2c-d06ef73d657c"). InnerVolumeSpecName "kube-api-access-dch2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.922439 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" (UID: "83b5e71a-c4e5-4da7-bd2c-d06ef73d657c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.928223 4778 generic.go:334] "Generic (PLEG): container finished" podID="83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" containerID="aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9" exitCode=0 Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.928301 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj769" Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.928305 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj769" event={"ID":"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c","Type":"ContainerDied","Data":"aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9"} Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.928395 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj769" event={"ID":"83b5e71a-c4e5-4da7-bd2c-d06ef73d657c","Type":"ContainerDied","Data":"dd673f3495d4097020ef5d7e63487695cfe5a30f045c8c2c653d16c6a971c4aa"} Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.928415 4778 scope.go:117] "RemoveContainer" containerID="aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9" Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.950152 4778 scope.go:117] "RemoveContainer" containerID="343138e2eb4ae3926491532e53293f0ab41237a5fbd9065af12a59848488ad59" Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.967162 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj769"] Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.982961 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj769"] Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.999603 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.999643 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:20 crc kubenswrapper[4778]: I1205 16:36:20.999658 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dch2g\" (UniqueName: \"kubernetes.io/projected/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c-kube-api-access-dch2g\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:21 crc kubenswrapper[4778]: I1205 16:36:21.001422 4778 scope.go:117] "RemoveContainer" containerID="1fc241b7718584a46d6661a39bd29ca4f09e943372767b2a0cbadc8ba29c2e2f" Dec 05 16:36:21 crc kubenswrapper[4778]: I1205 16:36:21.019718 4778 scope.go:117] "RemoveContainer" containerID="aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9" Dec 05 16:36:21 crc kubenswrapper[4778]: E1205 16:36:21.020087 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9\": container with ID starting with aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9 not found: ID does not exist" containerID="aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9" Dec 05 16:36:21 crc kubenswrapper[4778]: I1205 16:36:21.020121 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9"} err="failed to get container status \"aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9\": rpc error: code = NotFound desc = could not find container \"aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9\": container with ID starting with 
aaa3a79c055e3b0ab18d5fb45a952b0e59f97ca4a37240f367acd1ce40cf93e9 not found: ID does not exist" Dec 05 16:36:21 crc kubenswrapper[4778]: I1205 16:36:21.020149 4778 scope.go:117] "RemoveContainer" containerID="343138e2eb4ae3926491532e53293f0ab41237a5fbd9065af12a59848488ad59" Dec 05 16:36:21 crc kubenswrapper[4778]: E1205 16:36:21.020340 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"343138e2eb4ae3926491532e53293f0ab41237a5fbd9065af12a59848488ad59\": container with ID starting with 343138e2eb4ae3926491532e53293f0ab41237a5fbd9065af12a59848488ad59 not found: ID does not exist" containerID="343138e2eb4ae3926491532e53293f0ab41237a5fbd9065af12a59848488ad59" Dec 05 16:36:21 crc kubenswrapper[4778]: I1205 16:36:21.020387 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"343138e2eb4ae3926491532e53293f0ab41237a5fbd9065af12a59848488ad59"} err="failed to get container status \"343138e2eb4ae3926491532e53293f0ab41237a5fbd9065af12a59848488ad59\": rpc error: code = NotFound desc = could not find container \"343138e2eb4ae3926491532e53293f0ab41237a5fbd9065af12a59848488ad59\": container with ID starting with 343138e2eb4ae3926491532e53293f0ab41237a5fbd9065af12a59848488ad59 not found: ID does not exist" Dec 05 16:36:21 crc kubenswrapper[4778]: I1205 16:36:21.020431 4778 scope.go:117] "RemoveContainer" containerID="1fc241b7718584a46d6661a39bd29ca4f09e943372767b2a0cbadc8ba29c2e2f" Dec 05 16:36:21 crc kubenswrapper[4778]: E1205 16:36:21.020699 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fc241b7718584a46d6661a39bd29ca4f09e943372767b2a0cbadc8ba29c2e2f\": container with ID starting with 1fc241b7718584a46d6661a39bd29ca4f09e943372767b2a0cbadc8ba29c2e2f not found: ID does not exist" containerID="1fc241b7718584a46d6661a39bd29ca4f09e943372767b2a0cbadc8ba29c2e2f" Dec 05 16:36:21 crc kubenswrapper[4778]: I1205 16:36:21.020748 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc241b7718584a46d6661a39bd29ca4f09e943372767b2a0cbadc8ba29c2e2f"} err="failed to get container status \"1fc241b7718584a46d6661a39bd29ca4f09e943372767b2a0cbadc8ba29c2e2f\": rpc error: code = NotFound desc = could not find container \"1fc241b7718584a46d6661a39bd29ca4f09e943372767b2a0cbadc8ba29c2e2f\": container with ID starting with 1fc241b7718584a46d6661a39bd29ca4f09e943372767b2a0cbadc8ba29c2e2f not found: ID does not exist" Dec 05 16:36:21 crc kubenswrapper[4778]: I1205 16:36:21.259291 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" path="/var/lib/kubelet/pods/83b5e71a-c4e5-4da7-bd2c-d06ef73d657c/volumes" Dec 05 16:36:21 crc kubenswrapper[4778]: I1205 16:36:21.937766 4778 generic.go:334] "Generic (PLEG): container finished" podID="233d3868-b9e1-4500-8f58-101a60f83778" containerID="b6592f42fe79dcc3e52b2c8c1ae84ee4bad362a15a4239e63110eb0b55880bbf" exitCode=1 Dec 05 16:36:21 crc kubenswrapper[4778]: I1205 16:36:21.937812 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerDied","Data":"b6592f42fe79dcc3e52b2c8c1ae84ee4bad362a15a4239e63110eb0b55880bbf"} Dec 05 16:36:21 crc kubenswrapper[4778]: I1205 16:36:21.937891 4778 scope.go:117] "RemoveContainer" 
containerID="b548a1e38b1eab8a4703cec78875747636890351ce77d9863c3ba9ad610c2b06" Dec 05 16:36:21 crc kubenswrapper[4778]: I1205 16:36:21.938722 4778 scope.go:117] "RemoveContainer" containerID="b6592f42fe79dcc3e52b2c8c1ae84ee4bad362a15a4239e63110eb0b55880bbf" Dec 05 16:36:21 crc kubenswrapper[4778]: E1205 16:36:21.939187 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:36:24 crc kubenswrapper[4778]: I1205 16:36:24.479015 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:36:24 crc kubenswrapper[4778]: I1205 16:36:24.485148 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:36:24 crc kubenswrapper[4778]: I1205 16:36:24.485430 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:36:24 crc kubenswrapper[4778]: I1205 16:36:24.490262 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:36:24 crc kubenswrapper[4778]: I1205 16:36:24.563566 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:36:24 crc kubenswrapper[4778]: I1205 16:36:24.585938 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:36:24 crc kubenswrapper[4778]: I1205 16:36:24.633666 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:24 crc kubenswrapper[4778]: I1205 16:36:24.633710 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:24 crc kubenswrapper[4778]: I1205 16:36:24.634243 4778 scope.go:117] "RemoveContainer" containerID="b6592f42fe79dcc3e52b2c8c1ae84ee4bad362a15a4239e63110eb0b55880bbf" Dec 05 16:36:24 crc kubenswrapper[4778]: E1205 16:36:24.634510 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:36:24 crc kubenswrapper[4778]: I1205 16:36:24.972659 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:36:24 crc kubenswrapper[4778]: I1205 16:36:24.974646 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:36:25 crc kubenswrapper[4778]: I1205 16:36:25.023692 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:36:33 crc kubenswrapper[4778]: I1205 16:36:33.254285 4778 
scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:36:33 crc kubenswrapper[4778]: E1205 16:36:33.255082 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:36:35 crc kubenswrapper[4778]: I1205 16:36:35.250386 4778 scope.go:117] "RemoveContainer" containerID="b6592f42fe79dcc3e52b2c8c1ae84ee4bad362a15a4239e63110eb0b55880bbf" Dec 05 16:36:36 crc kubenswrapper[4778]: I1205 16:36:36.090674 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerStarted","Data":"6fdc213c7173be476fe1f2e98685fcf1eda2f9f3148268e8d576ab1fada0a673"} Dec 05 16:36:38 crc kubenswrapper[4778]: E1205 16:36:38.492851 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod233d3868_b9e1_4500_8f58_101a60f83778.slice/crio-conmon-6fdc213c7173be476fe1f2e98685fcf1eda2f9f3148268e8d576ab1fada0a673.scope\": RecentStats: unable to find data in memory cache]" Dec 05 16:36:39 crc kubenswrapper[4778]: I1205 16:36:39.117863 4778 generic.go:334] "Generic (PLEG): container finished" podID="233d3868-b9e1-4500-8f58-101a60f83778" containerID="6fdc213c7173be476fe1f2e98685fcf1eda2f9f3148268e8d576ab1fada0a673" exitCode=1 Dec 05 16:36:39 crc kubenswrapper[4778]: I1205 16:36:39.117940 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerDied","Data":"6fdc213c7173be476fe1f2e98685fcf1eda2f9f3148268e8d576ab1fada0a673"} Dec 05 16:36:39 crc kubenswrapper[4778]: I1205 16:36:39.117998 4778 scope.go:117] "RemoveContainer" containerID="b6592f42fe79dcc3e52b2c8c1ae84ee4bad362a15a4239e63110eb0b55880bbf" Dec 05 16:36:39 crc kubenswrapper[4778]: I1205 16:36:39.119167 4778 scope.go:117] "RemoveContainer" containerID="6fdc213c7173be476fe1f2e98685fcf1eda2f9f3148268e8d576ab1fada0a673" Dec 05 16:36:39 crc kubenswrapper[4778]: E1205 16:36:39.119661 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:36:44 crc kubenswrapper[4778]: I1205 16:36:44.633607 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:44 crc kubenswrapper[4778]: I1205 16:36:44.633906 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:44 crc kubenswrapper[4778]: I1205 16:36:44.633924 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 
16:36:44 crc kubenswrapper[4778]: I1205 16:36:44.633932 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:36:44 crc kubenswrapper[4778]: I1205 16:36:44.634550 4778 scope.go:117] "RemoveContainer" containerID="6fdc213c7173be476fe1f2e98685fcf1eda2f9f3148268e8d576ab1fada0a673" Dec 05 16:36:44 crc kubenswrapper[4778]: E1205 16:36:44.634793 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:36:45 crc kubenswrapper[4778]: I1205 16:36:45.249394 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:36:45 crc kubenswrapper[4778]: E1205 16:36:45.250085 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:36:56 crc kubenswrapper[4778]: I1205 16:36:56.249921 4778 scope.go:117] "RemoveContainer" containerID="6fdc213c7173be476fe1f2e98685fcf1eda2f9f3148268e8d576ab1fada0a673" Dec 05 16:36:56 crc kubenswrapper[4778]: E1205 16:36:56.250802 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:36:59 crc kubenswrapper[4778]: I1205 16:36:59.250059 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:36:59 crc kubenswrapper[4778]: E1205 16:36:59.251004 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:37:10 crc kubenswrapper[4778]: I1205 16:37:10.249351 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:37:11 crc kubenswrapper[4778]: I1205 16:37:11.249716 4778 scope.go:117] "RemoveContainer" containerID="6fdc213c7173be476fe1f2e98685fcf1eda2f9f3148268e8d576ab1fada0a673" Dec 05 16:37:11 crc kubenswrapper[4778]: I1205 16:37:11.380663 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"0252113aba7ac5b30976ddfa801607e96d4386b08fb7868f7c833a29836c7593"} Dec 
05 16:37:12 crc kubenswrapper[4778]: I1205 16:37:12.390726 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerStarted","Data":"9f0f0a4e0f398774e35bcfc22540a4cd1ed1aeba16e16350a8e3789d898b9380"} Dec 05 16:37:14 crc kubenswrapper[4778]: I1205 16:37:14.417246 4778 generic.go:334] "Generic (PLEG): container finished" podID="233d3868-b9e1-4500-8f58-101a60f83778" containerID="9f0f0a4e0f398774e35bcfc22540a4cd1ed1aeba16e16350a8e3789d898b9380" exitCode=1 Dec 05 16:37:14 crc kubenswrapper[4778]: I1205 16:37:14.417469 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerDied","Data":"9f0f0a4e0f398774e35bcfc22540a4cd1ed1aeba16e16350a8e3789d898b9380"} Dec 05 16:37:14 crc kubenswrapper[4778]: I1205 16:37:14.417841 4778 scope.go:117] "RemoveContainer" containerID="6fdc213c7173be476fe1f2e98685fcf1eda2f9f3148268e8d576ab1fada0a673" Dec 05 16:37:14 crc kubenswrapper[4778]: I1205 16:37:14.418413 4778 scope.go:117] "RemoveContainer" containerID="9f0f0a4e0f398774e35bcfc22540a4cd1ed1aeba16e16350a8e3789d898b9380" Dec 05 16:37:14 crc kubenswrapper[4778]: E1205 16:37:14.418655 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:37:14 crc kubenswrapper[4778]: I1205 16:37:14.634045 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:37:14 crc kubenswrapper[4778]: I1205 16:37:14.634078 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:37:14 crc kubenswrapper[4778]: I1205 16:37:14.634088 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:37:14 crc kubenswrapper[4778]: I1205 16:37:14.634096 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:37:15 crc kubenswrapper[4778]: I1205 16:37:15.427547 4778 scope.go:117] "RemoveContainer" containerID="9f0f0a4e0f398774e35bcfc22540a4cd1ed1aeba16e16350a8e3789d898b9380" Dec 05 16:37:15 crc kubenswrapper[4778]: E1205 16:37:15.428611 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:37:16 crc kubenswrapper[4778]: I1205 16:37:16.434286 4778 scope.go:117] "RemoveContainer" containerID="9f0f0a4e0f398774e35bcfc22540a4cd1ed1aeba16e16350a8e3789d898b9380" Dec 05 16:37:16 crc kubenswrapper[4778]: E1205 16:37:16.434542 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:37:30 crc kubenswrapper[4778]: I1205 16:37:30.249947 4778 scope.go:117] "RemoveContainer" containerID="9f0f0a4e0f398774e35bcfc22540a4cd1ed1aeba16e16350a8e3789d898b9380" Dec 05 16:37:30 crc kubenswrapper[4778]: E1205 16:37:30.250833 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:37:42 crc kubenswrapper[4778]: I1205 16:37:42.249865 4778 scope.go:117] "RemoveContainer" containerID="9f0f0a4e0f398774e35bcfc22540a4cd1ed1aeba16e16350a8e3789d898b9380" Dec 05 16:37:42 crc kubenswrapper[4778]: E1205 16:37:42.250479 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:37:48 crc kubenswrapper[4778]: I1205 16:37:48.385126 4778 scope.go:117] "RemoveContainer" containerID="6ac7b39e755a82306960e7fc086adaa8c4d26755dbe978f26ed5cee5d0504fcf" Dec 05 16:37:48 crc kubenswrapper[4778]: I1205 16:37:48.431533 4778 scope.go:117] "RemoveContainer" containerID="7160fe385086b3b87f7c8ddeeef79fe328ce041405f1121952ad3a8ba05ca9ce" Dec 05 16:37:48 crc kubenswrapper[4778]: I1205 16:37:48.480321 4778 scope.go:117] "RemoveContainer" containerID="d85bc21af86055dc7c9fb771fd32cfac038318826a7ac50e0a6e7a35d85466c8" Dec 05 16:37:48 crc kubenswrapper[4778]: I1205 16:37:48.498107 4778 scope.go:117] "RemoveContainer" containerID="388d96eda6e3b2d009e48735e676c2b1521389e5f7b25e5ad22cab702b7cd001" Dec 05 16:37:55 crc kubenswrapper[4778]: I1205 16:37:55.253891 4778 scope.go:117] "RemoveContainer" containerID="9f0f0a4e0f398774e35bcfc22540a4cd1ed1aeba16e16350a8e3789d898b9380" Dec 05 16:37:55 crc kubenswrapper[4778]: I1205 16:37:55.768854 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerStarted","Data":"1ecf92dd4714ecf93a1ba2f43a38f291764a25821f0242760fe360d9df04235a"} Dec 05 16:37:58 crc kubenswrapper[4778]: I1205 16:37:58.797880 4778 generic.go:334] "Generic (PLEG): container finished" podID="233d3868-b9e1-4500-8f58-101a60f83778" containerID="1ecf92dd4714ecf93a1ba2f43a38f291764a25821f0242760fe360d9df04235a" exitCode=1 Dec 05 16:37:58 crc kubenswrapper[4778]: I1205 16:37:58.797951 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerDied","Data":"1ecf92dd4714ecf93a1ba2f43a38f291764a25821f0242760fe360d9df04235a"} Dec 05 16:37:58 crc kubenswrapper[4778]: I1205 16:37:58.798486 4778 scope.go:117] "RemoveContainer" 
containerID="9f0f0a4e0f398774e35bcfc22540a4cd1ed1aeba16e16350a8e3789d898b9380" Dec 05 16:37:58 crc kubenswrapper[4778]: I1205 16:37:58.799350 4778 scope.go:117] "RemoveContainer" containerID="1ecf92dd4714ecf93a1ba2f43a38f291764a25821f0242760fe360d9df04235a" Dec 05 16:37:58 crc kubenswrapper[4778]: E1205 16:37:58.799873 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:38:04 crc kubenswrapper[4778]: I1205 16:38:04.633539 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:38:04 crc kubenswrapper[4778]: I1205 16:38:04.634133 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:38:04 crc kubenswrapper[4778]: I1205 16:38:04.634840 4778 scope.go:117] "RemoveContainer" containerID="1ecf92dd4714ecf93a1ba2f43a38f291764a25821f0242760fe360d9df04235a" Dec 05 16:38:04 crc kubenswrapper[4778]: E1205 16:38:04.635141 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:38:14 crc kubenswrapper[4778]: I1205 16:38:14.633605 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:38:14 crc kubenswrapper[4778]: I1205 16:38:14.634234 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:38:14 crc kubenswrapper[4778]: I1205 16:38:14.634927 4778 scope.go:117] "RemoveContainer" containerID="1ecf92dd4714ecf93a1ba2f43a38f291764a25821f0242760fe360d9df04235a" Dec 05 16:38:14 crc kubenswrapper[4778]: E1205 16:38:14.635208 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:38:29 crc kubenswrapper[4778]: I1205 16:38:29.249956 4778 scope.go:117] "RemoveContainer" containerID="1ecf92dd4714ecf93a1ba2f43a38f291764a25821f0242760fe360d9df04235a" Dec 05 16:38:29 crc kubenswrapper[4778]: E1205 16:38:29.250642 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:38:40 crc kubenswrapper[4778]: 
I1205 16:38:40.249480 4778 scope.go:117] "RemoveContainer" containerID="1ecf92dd4714ecf93a1ba2f43a38f291764a25821f0242760fe360d9df04235a" Dec 05 16:38:40 crc kubenswrapper[4778]: E1205 16:38:40.268279 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:38:54 crc kubenswrapper[4778]: I1205 16:38:54.250005 4778 scope.go:117] "RemoveContainer" containerID="1ecf92dd4714ecf93a1ba2f43a38f291764a25821f0242760fe360d9df04235a" Dec 05 16:38:54 crc kubenswrapper[4778]: E1205 16:38:54.250714 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:39:06 crc kubenswrapper[4778]: I1205 16:39:06.249572 4778 scope.go:117] "RemoveContainer" containerID="1ecf92dd4714ecf93a1ba2f43a38f291764a25821f0242760fe360d9df04235a" Dec 05 16:39:06 crc kubenswrapper[4778]: E1205 16:39:06.250921 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.135485 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f9hxq"] Dec 05 16:39:11 crc kubenswrapper[4778]: E1205 16:39:11.136599 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" containerName="extract-utilities" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.136619 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" containerName="extract-utilities" Dec 05 16:39:11 crc kubenswrapper[4778]: E1205 16:39:11.136642 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" containerName="extract-content" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.136652 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" containerName="extract-content" Dec 05 16:39:11 crc kubenswrapper[4778]: E1205 16:39:11.136671 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" containerName="registry-server" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.136680 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" containerName="registry-server" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.136882 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="83b5e71a-c4e5-4da7-bd2c-d06ef73d657c" containerName="registry-server" Dec 05 16:39:11 crc kubenswrapper[4778]: 
I1205 16:39:11.138591 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.156603 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f9hxq"] Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.241528 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-catalog-content\") pod \"certified-operators-f9hxq\" (UID: \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\") " pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.241591 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqhc6\" (UniqueName: \"kubernetes.io/projected/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-kube-api-access-vqhc6\") pod \"certified-operators-f9hxq\" (UID: \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\") " pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.241947 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-utilities\") pod \"certified-operators-f9hxq\" (UID: \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\") " pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.343125 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-utilities\") pod \"certified-operators-f9hxq\" (UID: \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\") " pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.343197 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-catalog-content\") pod \"certified-operators-f9hxq\" (UID: \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\") " pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.343246 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqhc6\" (UniqueName: \"kubernetes.io/projected/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-kube-api-access-vqhc6\") pod \"certified-operators-f9hxq\" (UID: \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\") " pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.343758 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-utilities\") pod \"certified-operators-f9hxq\" (UID: \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\") " pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.343858 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-catalog-content\") pod \"certified-operators-f9hxq\" (UID: \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\") " pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:11 crc 
kubenswrapper[4778]: I1205 16:39:11.366162 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqhc6\" (UniqueName: \"kubernetes.io/projected/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-kube-api-access-vqhc6\") pod \"certified-operators-f9hxq\" (UID: \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\") " pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.469716 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:11 crc kubenswrapper[4778]: I1205 16:39:11.789965 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f9hxq"] Dec 05 16:39:12 crc kubenswrapper[4778]: I1205 16:39:12.497895 4778 generic.go:334] "Generic (PLEG): container finished" podID="71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" containerID="ac5e8e1468fa38885ac8d9ac65b56f4a30e3ea8071ff7aeefd9432035dc6f02f" exitCode=0 Dec 05 16:39:12 crc kubenswrapper[4778]: I1205 16:39:12.497988 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9hxq" event={"ID":"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6","Type":"ContainerDied","Data":"ac5e8e1468fa38885ac8d9ac65b56f4a30e3ea8071ff7aeefd9432035dc6f02f"} Dec 05 16:39:12 crc kubenswrapper[4778]: I1205 16:39:12.498245 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9hxq" event={"ID":"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6","Type":"ContainerStarted","Data":"668648f1a779e01409b2ba045f6e9da75157a40e4a63639fca8752fd5d5939d4"} Dec 05 16:39:13 crc kubenswrapper[4778]: I1205 16:39:13.509779 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9hxq" event={"ID":"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6","Type":"ContainerStarted","Data":"549d5e60617bbc84bd4999cd1da487edb77c5c8fd300d86c574ca5db529a5bcf"} Dec 05 16:39:14 crc kubenswrapper[4778]: I1205 16:39:14.518998 4778 generic.go:334] "Generic (PLEG): container finished" podID="71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" containerID="549d5e60617bbc84bd4999cd1da487edb77c5c8fd300d86c574ca5db529a5bcf" exitCode=0 Dec 05 16:39:14 crc kubenswrapper[4778]: I1205 16:39:14.519066 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9hxq" event={"ID":"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6","Type":"ContainerDied","Data":"549d5e60617bbc84bd4999cd1da487edb77c5c8fd300d86c574ca5db529a5bcf"} Dec 05 16:39:15 crc kubenswrapper[4778]: I1205 16:39:15.528311 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9hxq" event={"ID":"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6","Type":"ContainerStarted","Data":"3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417"} Dec 05 16:39:15 crc kubenswrapper[4778]: I1205 16:39:15.552531 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f9hxq" podStartSLOduration=2.013843625 podStartE2EDuration="4.552510826s" podCreationTimestamp="2025-12-05 16:39:11 +0000 UTC" firstStartedPulling="2025-12-05 16:39:12.499911137 +0000 UTC m=+2639.603707517" lastFinishedPulling="2025-12-05 16:39:15.038578338 +0000 UTC m=+2642.142374718" observedRunningTime="2025-12-05 16:39:15.549503373 +0000 UTC m=+2642.653299763" watchObservedRunningTime="2025-12-05 16:39:15.552510826 +0000 UTC m=+2642.656307206" Dec 05 16:39:19 crc kubenswrapper[4778]: I1205 
Dec 05 16:39:19 crc kubenswrapper[4778]: I1205 16:39:19.249568 4778 scope.go:117] "RemoveContainer" containerID="1ecf92dd4714ecf93a1ba2f43a38f291764a25821f0242760fe360d9df04235a"
Dec 05 16:39:19 crc kubenswrapper[4778]: I1205 16:39:19.561013 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerStarted","Data":"cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64"}
Dec 05 16:39:21 crc kubenswrapper[4778]: I1205 16:39:21.469967 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f9hxq"
Dec 05 16:39:21 crc kubenswrapper[4778]: I1205 16:39:21.470294 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f9hxq"
Dec 05 16:39:21 crc kubenswrapper[4778]: I1205 16:39:21.516934 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f9hxq"
Dec 05 16:39:21 crc kubenswrapper[4778]: I1205 16:39:21.727708 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f9hxq"
Dec 05 16:39:22 crc kubenswrapper[4778]: I1205 16:39:22.629982 4778 generic.go:334] "Generic (PLEG): container finished" podID="233d3868-b9e1-4500-8f58-101a60f83778" containerID="cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64" exitCode=1
Dec 05 16:39:22 crc kubenswrapper[4778]: I1205 16:39:22.630282 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerDied","Data":"cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64"}
Dec 05 16:39:22 crc kubenswrapper[4778]: I1205 16:39:22.630331 4778 scope.go:117] "RemoveContainer" containerID="1ecf92dd4714ecf93a1ba2f43a38f291764a25821f0242760fe360d9df04235a"
Dec 05 16:39:22 crc kubenswrapper[4778]: I1205 16:39:22.631087 4778 scope.go:117] "RemoveContainer" containerID="cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64"
Dec 05 16:39:22 crc kubenswrapper[4778]: E1205 16:39:22.631323 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778"
Dec 05 16:39:24 crc kubenswrapper[4778]: I1205 16:39:24.633687 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:39:24 crc kubenswrapper[4778]: I1205 16:39:24.634012 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:39:24 crc kubenswrapper[4778]: I1205 16:39:24.634630 4778 scope.go:117] "RemoveContainer" containerID="cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64"
Dec 05 16:39:24 crc kubenswrapper[4778]: E1205 16:39:24.634946 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine
pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.116177 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f9hxq"] Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.116435 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f9hxq" podUID="71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" containerName="registry-server" containerID="cri-o://3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417" gracePeriod=2 Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.555567 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.613045 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-catalog-content\") pod \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\" (UID: \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\") " Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.613112 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-utilities\") pod \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\" (UID: \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\") " Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.613146 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqhc6\" (UniqueName: \"kubernetes.io/projected/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-kube-api-access-vqhc6\") pod \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\" (UID: \"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6\") " Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.614139 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-utilities" (OuterVolumeSpecName: "utilities") pod "71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" (UID: "71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.624832 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-kube-api-access-vqhc6" (OuterVolumeSpecName: "kube-api-access-vqhc6") pod "71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" (UID: "71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6"). InnerVolumeSpecName "kube-api-access-vqhc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.659583 4778 generic.go:334] "Generic (PLEG): container finished" podID="71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" containerID="3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417" exitCode=0 Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.659633 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9hxq" event={"ID":"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6","Type":"ContainerDied","Data":"3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417"} Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.659656 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9hxq" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.659699 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9hxq" event={"ID":"71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6","Type":"ContainerDied","Data":"668648f1a779e01409b2ba045f6e9da75157a40e4a63639fca8752fd5d5939d4"} Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.659723 4778 scope.go:117] "RemoveContainer" containerID="3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.665735 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" (UID: "71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.683118 4778 scope.go:117] "RemoveContainer" containerID="549d5e60617bbc84bd4999cd1da487edb77c5c8fd300d86c574ca5db529a5bcf" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.704678 4778 scope.go:117] "RemoveContainer" containerID="ac5e8e1468fa38885ac8d9ac65b56f4a30e3ea8071ff7aeefd9432035dc6f02f" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.715664 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.715696 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.715707 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqhc6\" (UniqueName: \"kubernetes.io/projected/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6-kube-api-access-vqhc6\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.744807 4778 scope.go:117] "RemoveContainer" containerID="3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417" Dec 05 16:39:25 crc kubenswrapper[4778]: E1205 16:39:25.745298 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417\": container with ID starting with 3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417 not found: ID does not exist" 
containerID="3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.745350 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417"} err="failed to get container status \"3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417\": rpc error: code = NotFound desc = could not find container \"3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417\": container with ID starting with 3ef5b4cf9b5f8a0c5ebc29b37fb78eab0eb75ea60e28166cc7465c106d76d417 not found: ID does not exist" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.745397 4778 scope.go:117] "RemoveContainer" containerID="549d5e60617bbc84bd4999cd1da487edb77c5c8fd300d86c574ca5db529a5bcf" Dec 05 16:39:25 crc kubenswrapper[4778]: E1205 16:39:25.745728 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"549d5e60617bbc84bd4999cd1da487edb77c5c8fd300d86c574ca5db529a5bcf\": container with ID starting with 549d5e60617bbc84bd4999cd1da487edb77c5c8fd300d86c574ca5db529a5bcf not found: ID does not exist" containerID="549d5e60617bbc84bd4999cd1da487edb77c5c8fd300d86c574ca5db529a5bcf" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.745774 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"549d5e60617bbc84bd4999cd1da487edb77c5c8fd300d86c574ca5db529a5bcf"} err="failed to get container status \"549d5e60617bbc84bd4999cd1da487edb77c5c8fd300d86c574ca5db529a5bcf\": rpc error: code = NotFound desc = could not find container \"549d5e60617bbc84bd4999cd1da487edb77c5c8fd300d86c574ca5db529a5bcf\": container with ID starting with 549d5e60617bbc84bd4999cd1da487edb77c5c8fd300d86c574ca5db529a5bcf not found: ID does not exist" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.745789 4778 scope.go:117] "RemoveContainer" containerID="ac5e8e1468fa38885ac8d9ac65b56f4a30e3ea8071ff7aeefd9432035dc6f02f" Dec 05 16:39:25 crc kubenswrapper[4778]: E1205 16:39:25.746046 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5e8e1468fa38885ac8d9ac65b56f4a30e3ea8071ff7aeefd9432035dc6f02f\": container with ID starting with ac5e8e1468fa38885ac8d9ac65b56f4a30e3ea8071ff7aeefd9432035dc6f02f not found: ID does not exist" containerID="ac5e8e1468fa38885ac8d9ac65b56f4a30e3ea8071ff7aeefd9432035dc6f02f" Dec 05 16:39:25 crc kubenswrapper[4778]: I1205 16:39:25.746077 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5e8e1468fa38885ac8d9ac65b56f4a30e3ea8071ff7aeefd9432035dc6f02f"} err="failed to get container status \"ac5e8e1468fa38885ac8d9ac65b56f4a30e3ea8071ff7aeefd9432035dc6f02f\": rpc error: code = NotFound desc = could not find container \"ac5e8e1468fa38885ac8d9ac65b56f4a30e3ea8071ff7aeefd9432035dc6f02f\": container with ID starting with ac5e8e1468fa38885ac8d9ac65b56f4a30e3ea8071ff7aeefd9432035dc6f02f not found: ID does not exist" Dec 05 16:39:26 crc kubenswrapper[4778]: I1205 16:39:26.010135 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f9hxq"] Dec 05 16:39:26 crc kubenswrapper[4778]: I1205 16:39:26.024066 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f9hxq"] Dec 05 16:39:27 crc kubenswrapper[4778]: I1205 16:39:27.259005 
Dec 05 16:39:27 crc kubenswrapper[4778]: I1205 16:39:27.259005 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" path="/var/lib/kubelet/pods/71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6/volumes"
Dec 05 16:39:33 crc kubenswrapper[4778]: I1205 16:39:33.414634 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 16:39:33 crc kubenswrapper[4778]: I1205 16:39:33.414964 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 16:39:36 crc kubenswrapper[4778]: I1205 16:39:36.249943 4778 scope.go:117] "RemoveContainer" containerID="cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64"
Dec 05 16:39:36 crc kubenswrapper[4778]: E1205 16:39:36.250586 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778"
Dec 05 16:39:44 crc kubenswrapper[4778]: I1205 16:39:44.633415 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:39:44 crc kubenswrapper[4778]: I1205 16:39:44.633937 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:39:44 crc kubenswrapper[4778]: I1205 16:39:44.634513 4778 scope.go:117] "RemoveContainer" containerID="cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64"
Dec 05 16:39:44 crc kubenswrapper[4778]: E1205 16:39:44.634705 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778"
Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.721592 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pjw6h"]
Dec 05 16:39:51 crc kubenswrapper[4778]: E1205 16:39:51.723637 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" containerName="extract-utilities"
Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.723757 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" containerName="extract-utilities"
Dec 05 16:39:51 crc kubenswrapper[4778]: E1205 16:39:51.723869 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" containerName="registry-server"
Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.723946 4778 state_mem.go:107] "Deleted CPUSet assignment"
podUID="71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" containerName="registry-server" Dec 05 16:39:51 crc kubenswrapper[4778]: E1205 16:39:51.724042 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" containerName="extract-content" Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.724117 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" containerName="extract-content" Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.724418 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b6a8c3-3d71-4e9f-b1d0-49b0015cbcb6" containerName="registry-server" Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.725982 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.737025 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjw6h"] Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.769734 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268096e2-14e3-4009-9679-c72badaf3b74-utilities\") pod \"redhat-operators-pjw6h\" (UID: \"268096e2-14e3-4009-9679-c72badaf3b74\") " pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.769931 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268096e2-14e3-4009-9679-c72badaf3b74-catalog-content\") pod \"redhat-operators-pjw6h\" (UID: \"268096e2-14e3-4009-9679-c72badaf3b74\") " pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.769977 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5v5q\" (UniqueName: \"kubernetes.io/projected/268096e2-14e3-4009-9679-c72badaf3b74-kube-api-access-z5v5q\") pod \"redhat-operators-pjw6h\" (UID: \"268096e2-14e3-4009-9679-c72badaf3b74\") " pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.872163 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268096e2-14e3-4009-9679-c72badaf3b74-catalog-content\") pod \"redhat-operators-pjw6h\" (UID: \"268096e2-14e3-4009-9679-c72badaf3b74\") " pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.872223 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5v5q\" (UniqueName: \"kubernetes.io/projected/268096e2-14e3-4009-9679-c72badaf3b74-kube-api-access-z5v5q\") pod \"redhat-operators-pjw6h\" (UID: \"268096e2-14e3-4009-9679-c72badaf3b74\") " pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.872274 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268096e2-14e3-4009-9679-c72badaf3b74-utilities\") pod \"redhat-operators-pjw6h\" (UID: \"268096e2-14e3-4009-9679-c72badaf3b74\") " pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.872632 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268096e2-14e3-4009-9679-c72badaf3b74-catalog-content\") pod \"redhat-operators-pjw6h\" (UID: \"268096e2-14e3-4009-9679-c72badaf3b74\") " pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.872706 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268096e2-14e3-4009-9679-c72badaf3b74-utilities\") pod \"redhat-operators-pjw6h\" (UID: \"268096e2-14e3-4009-9679-c72badaf3b74\") " pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:39:51 crc kubenswrapper[4778]: I1205 16:39:51.891588 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5v5q\" (UniqueName: \"kubernetes.io/projected/268096e2-14e3-4009-9679-c72badaf3b74-kube-api-access-z5v5q\") pod \"redhat-operators-pjw6h\" (UID: \"268096e2-14e3-4009-9679-c72badaf3b74\") " pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:39:52 crc kubenswrapper[4778]: I1205 16:39:52.048062 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:39:52 crc kubenswrapper[4778]: I1205 16:39:52.466072 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjw6h"] Dec 05 16:39:52 crc kubenswrapper[4778]: I1205 16:39:52.477104 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw6h" event={"ID":"268096e2-14e3-4009-9679-c72badaf3b74","Type":"ContainerStarted","Data":"c81b68f7c8c60e6d8552a18d82bad77b4547a0b8417682ee77afe391dd74d685"} Dec 05 16:39:53 crc kubenswrapper[4778]: I1205 16:39:53.488735 4778 generic.go:334] "Generic (PLEG): container finished" podID="268096e2-14e3-4009-9679-c72badaf3b74" containerID="26d39f2fed8de1fd5ff498fa6981e385ea392635ba4b38bb943fbbeb8c928dba" exitCode=0 Dec 05 16:39:53 crc kubenswrapper[4778]: I1205 16:39:53.489293 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw6h" event={"ID":"268096e2-14e3-4009-9679-c72badaf3b74","Type":"ContainerDied","Data":"26d39f2fed8de1fd5ff498fa6981e385ea392635ba4b38bb943fbbeb8c928dba"} Dec 05 16:39:54 crc kubenswrapper[4778]: I1205 16:39:54.506572 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw6h" event={"ID":"268096e2-14e3-4009-9679-c72badaf3b74","Type":"ContainerStarted","Data":"a074b4b690115d2d4aa3bf9e82743ea49c52535886ae6a41946f37fa70855aca"} Dec 05 16:39:55 crc kubenswrapper[4778]: I1205 16:39:55.520086 4778 generic.go:334] "Generic (PLEG): container finished" podID="268096e2-14e3-4009-9679-c72badaf3b74" containerID="a074b4b690115d2d4aa3bf9e82743ea49c52535886ae6a41946f37fa70855aca" exitCode=0 Dec 05 16:39:55 crc kubenswrapper[4778]: I1205 16:39:55.520126 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw6h" event={"ID":"268096e2-14e3-4009-9679-c72badaf3b74","Type":"ContainerDied","Data":"a074b4b690115d2d4aa3bf9e82743ea49c52535886ae6a41946f37fa70855aca"} Dec 05 16:39:58 crc kubenswrapper[4778]: I1205 16:39:58.543701 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw6h" event={"ID":"268096e2-14e3-4009-9679-c72badaf3b74","Type":"ContainerStarted","Data":"bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a"} Dec 05 16:39:58 crc 
kubenswrapper[4778]: I1205 16:39:58.567812 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pjw6h" podStartSLOduration=4.571065345 podStartE2EDuration="7.567794569s" podCreationTimestamp="2025-12-05 16:39:51 +0000 UTC" firstStartedPulling="2025-12-05 16:39:53.491077172 +0000 UTC m=+2680.594873562" lastFinishedPulling="2025-12-05 16:39:56.487806406 +0000 UTC m=+2683.591602786" observedRunningTime="2025-12-05 16:39:58.560780007 +0000 UTC m=+2685.664576417" watchObservedRunningTime="2025-12-05 16:39:58.567794569 +0000 UTC m=+2685.671590949"
Dec 05 16:40:00 crc kubenswrapper[4778]: I1205 16:40:00.249646 4778 scope.go:117] "RemoveContainer" containerID="cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64"
Dec 05 16:40:00 crc kubenswrapper[4778]: E1205 16:40:00.250219 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778"
Dec 05 16:40:02 crc kubenswrapper[4778]: I1205 16:40:02.048611 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pjw6h"
Dec 05 16:40:02 crc kubenswrapper[4778]: I1205 16:40:02.048967 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pjw6h"
Dec 05 16:40:03 crc kubenswrapper[4778]: I1205 16:40:03.111510 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pjw6h" podUID="268096e2-14e3-4009-9679-c72badaf3b74" containerName="registry-server" probeResult="failure" output=<
Dec 05 16:40:03 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s
Dec 05 16:40:03 crc kubenswrapper[4778]: >
Dec 05 16:40:03 crc kubenswrapper[4778]: I1205 16:40:03.414914 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 16:40:03 crc kubenswrapper[4778]: I1205 16:40:03.415019 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 16:40:12 crc kubenswrapper[4778]: I1205 16:40:12.105564 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pjw6h"
Dec 05 16:40:12 crc kubenswrapper[4778]: I1205 16:40:12.166621 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pjw6h"
Dec 05 16:40:13 crc kubenswrapper[4778]: I1205 16:40:13.264983 4778 scope.go:117] "RemoveContainer" containerID="cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64"
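[Editor's note] The recurring "back-off 2m40s restarting failed container" errors are kubelet's CrashLoopBackOff at work: the restart delay starts at 10s and doubles with each crash, capped at 5 minutes, so 2m40s (160s) is the fifth step of the schedule. A sketch of that schedule (the 10s base, factor 2, and 300s cap are the kubelet defaults; the function name is illustrative):

    def crashloop_delay(restart: int, base: float = 10.0, cap: float = 300.0) -> float:
        """Back-off before restart number `restart` (0-based): 10s, 20s, 40s, ..."""
        return min(base * (2 ** restart), cap)

    print([crashloop_delay(n) for n in range(6)])
    # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0]  -- 160s == "back-off 2m40s"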
CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:40:15 crc kubenswrapper[4778]: I1205 16:40:15.921276 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pjw6h"] Dec 05 16:40:15 crc kubenswrapper[4778]: I1205 16:40:15.922232 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pjw6h" podUID="268096e2-14e3-4009-9679-c72badaf3b74" containerName="registry-server" containerID="cri-o://bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a" gracePeriod=2 Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.449561 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.598192 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268096e2-14e3-4009-9679-c72badaf3b74-utilities\") pod \"268096e2-14e3-4009-9679-c72badaf3b74\" (UID: \"268096e2-14e3-4009-9679-c72badaf3b74\") " Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.598296 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5v5q\" (UniqueName: \"kubernetes.io/projected/268096e2-14e3-4009-9679-c72badaf3b74-kube-api-access-z5v5q\") pod \"268096e2-14e3-4009-9679-c72badaf3b74\" (UID: \"268096e2-14e3-4009-9679-c72badaf3b74\") " Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.598318 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268096e2-14e3-4009-9679-c72badaf3b74-catalog-content\") pod \"268096e2-14e3-4009-9679-c72badaf3b74\" (UID: \"268096e2-14e3-4009-9679-c72badaf3b74\") " Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.599668 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/268096e2-14e3-4009-9679-c72badaf3b74-utilities" (OuterVolumeSpecName: "utilities") pod "268096e2-14e3-4009-9679-c72badaf3b74" (UID: "268096e2-14e3-4009-9679-c72badaf3b74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.604432 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268096e2-14e3-4009-9679-c72badaf3b74-kube-api-access-z5v5q" (OuterVolumeSpecName: "kube-api-access-z5v5q") pod "268096e2-14e3-4009-9679-c72badaf3b74" (UID: "268096e2-14e3-4009-9679-c72badaf3b74"). InnerVolumeSpecName "kube-api-access-z5v5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.690464 4778 generic.go:334] "Generic (PLEG): container finished" podID="268096e2-14e3-4009-9679-c72badaf3b74" containerID="bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a" exitCode=0 Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.690521 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw6h" event={"ID":"268096e2-14e3-4009-9679-c72badaf3b74","Type":"ContainerDied","Data":"bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a"} Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.690554 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjw6h" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.690588 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw6h" event={"ID":"268096e2-14e3-4009-9679-c72badaf3b74","Type":"ContainerDied","Data":"c81b68f7c8c60e6d8552a18d82bad77b4547a0b8417682ee77afe391dd74d685"} Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.690611 4778 scope.go:117] "RemoveContainer" containerID="bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.699924 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5v5q\" (UniqueName: \"kubernetes.io/projected/268096e2-14e3-4009-9679-c72badaf3b74-kube-api-access-z5v5q\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.700163 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268096e2-14e3-4009-9679-c72badaf3b74-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.710239 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/268096e2-14e3-4009-9679-c72badaf3b74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "268096e2-14e3-4009-9679-c72badaf3b74" (UID: "268096e2-14e3-4009-9679-c72badaf3b74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.711311 4778 scope.go:117] "RemoveContainer" containerID="a074b4b690115d2d4aa3bf9e82743ea49c52535886ae6a41946f37fa70855aca" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.731315 4778 scope.go:117] "RemoveContainer" containerID="26d39f2fed8de1fd5ff498fa6981e385ea392635ba4b38bb943fbbeb8c928dba" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.763497 4778 scope.go:117] "RemoveContainer" containerID="bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a" Dec 05 16:40:16 crc kubenswrapper[4778]: E1205 16:40:16.764014 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a\": container with ID starting with bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a not found: ID does not exist" containerID="bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.764066 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a"} err="failed to get container status \"bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a\": rpc error: code = NotFound desc = could not find container \"bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a\": container with ID starting with bae1570e8a609dc6ebfa4a83dca68d6863a7ed073d3a453b90e6e3c8eec42b6a not found: ID does not exist" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.764203 4778 scope.go:117] "RemoveContainer" containerID="a074b4b690115d2d4aa3bf9e82743ea49c52535886ae6a41946f37fa70855aca" Dec 05 16:40:16 crc kubenswrapper[4778]: E1205 16:40:16.764587 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a074b4b690115d2d4aa3bf9e82743ea49c52535886ae6a41946f37fa70855aca\": container with ID starting with a074b4b690115d2d4aa3bf9e82743ea49c52535886ae6a41946f37fa70855aca not found: ID does not exist" containerID="a074b4b690115d2d4aa3bf9e82743ea49c52535886ae6a41946f37fa70855aca" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.764621 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a074b4b690115d2d4aa3bf9e82743ea49c52535886ae6a41946f37fa70855aca"} err="failed to get container status \"a074b4b690115d2d4aa3bf9e82743ea49c52535886ae6a41946f37fa70855aca\": rpc error: code = NotFound desc = could not find container \"a074b4b690115d2d4aa3bf9e82743ea49c52535886ae6a41946f37fa70855aca\": container with ID starting with a074b4b690115d2d4aa3bf9e82743ea49c52535886ae6a41946f37fa70855aca not found: ID does not exist" Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.764646 4778 scope.go:117] "RemoveContainer" containerID="26d39f2fed8de1fd5ff498fa6981e385ea392635ba4b38bb943fbbeb8c928dba" Dec 05 16:40:16 crc kubenswrapper[4778]: E1205 16:40:16.765103 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d39f2fed8de1fd5ff498fa6981e385ea392635ba4b38bb943fbbeb8c928dba\": container with ID starting with 26d39f2fed8de1fd5ff498fa6981e385ea392635ba4b38bb943fbbeb8c928dba not found: ID does not exist" containerID="26d39f2fed8de1fd5ff498fa6981e385ea392635ba4b38bb943fbbeb8c928dba" Dec 05 16:40:16 crc 
kubenswrapper[4778]: I1205 16:40:16.765130 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d39f2fed8de1fd5ff498fa6981e385ea392635ba4b38bb943fbbeb8c928dba"} err="failed to get container status \"26d39f2fed8de1fd5ff498fa6981e385ea392635ba4b38bb943fbbeb8c928dba\": rpc error: code = NotFound desc = could not find container \"26d39f2fed8de1fd5ff498fa6981e385ea392635ba4b38bb943fbbeb8c928dba\": container with ID starting with 26d39f2fed8de1fd5ff498fa6981e385ea392635ba4b38bb943fbbeb8c928dba not found: ID does not exist"
Dec 05 16:40:16 crc kubenswrapper[4778]: I1205 16:40:16.801200 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268096e2-14e3-4009-9679-c72badaf3b74-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 16:40:17 crc kubenswrapper[4778]: I1205 16:40:17.033697 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pjw6h"]
Dec 05 16:40:17 crc kubenswrapper[4778]: I1205 16:40:17.044907 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pjw6h"]
Dec 05 16:40:17 crc kubenswrapper[4778]: I1205 16:40:17.264068 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="268096e2-14e3-4009-9679-c72badaf3b74" path="/var/lib/kubelet/pods/268096e2-14e3-4009-9679-c72badaf3b74/volumes"
Dec 05 16:40:25 crc kubenswrapper[4778]: I1205 16:40:25.249512 4778 scope.go:117] "RemoveContainer" containerID="cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64"
Dec 05 16:40:25 crc kubenswrapper[4778]: E1205 16:40:25.250399 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778"
Dec 05 16:40:33 crc kubenswrapper[4778]: I1205 16:40:33.414600 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 16:40:33 crc kubenswrapper[4778]: I1205 16:40:33.416650 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 16:40:33 crc kubenswrapper[4778]: I1205 16:40:33.416865 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw"
Dec 05 16:40:33 crc kubenswrapper[4778]: I1205 16:40:33.417906 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0252113aba7ac5b30976ddfa801607e96d4386b08fb7868f7c833a29836c7593"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
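[Editor's note] The machine-config-daemon liveness probe above is a plain HTTP GET against http://127.0.0.1:8798/health; a refused connection counts as a failure, and once the failure threshold is reached the kubelet kills the container with the grace period shown below (gracePeriod=600). A rough Python equivalent of the check itself (the kubelet's prober also applies thresholds, timeouts and headers; this is only the success/failure core):

    import urllib.request
    import urllib.error

    def http_liveness(url: str = "http://127.0.0.1:8798/health", timeout: float = 1.0) -> bool:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return 200 <= resp.status < 400  # 2xx/3xx counts as healthy
        except (urllib.error.URLError, OSError):
            return False  # e.g. "connect: connection refused", as logged above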
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://0252113aba7ac5b30976ddfa801607e96d4386b08fb7868f7c833a29836c7593" gracePeriod=600 Dec 05 16:40:33 crc kubenswrapper[4778]: E1205 16:40:33.712494 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode780ff27_1d00_4280_8e7e_9eb9fe3dea6e.slice/crio-conmon-0252113aba7ac5b30976ddfa801607e96d4386b08fb7868f7c833a29836c7593.scope\": RecentStats: unable to find data in memory cache]" Dec 05 16:40:33 crc kubenswrapper[4778]: I1205 16:40:33.842091 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="0252113aba7ac5b30976ddfa801607e96d4386b08fb7868f7c833a29836c7593" exitCode=0 Dec 05 16:40:33 crc kubenswrapper[4778]: I1205 16:40:33.842263 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"0252113aba7ac5b30976ddfa801607e96d4386b08fb7868f7c833a29836c7593"} Dec 05 16:40:33 crc kubenswrapper[4778]: I1205 16:40:33.842553 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5"} Dec 05 16:40:33 crc kubenswrapper[4778]: I1205 16:40:33.842594 4778 scope.go:117] "RemoveContainer" containerID="2f6e0c9fd5e98e81891d5c4a7d1796a5ecb020c57fabf4f2391805b8a20da93c" Dec 05 16:40:37 crc kubenswrapper[4778]: I1205 16:40:37.250552 4778 scope.go:117] "RemoveContainer" containerID="cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64" Dec 05 16:40:37 crc kubenswrapper[4778]: E1205 16:40:37.251266 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:40:48 crc kubenswrapper[4778]: I1205 16:40:48.249914 4778 scope.go:117] "RemoveContainer" containerID="cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64" Dec 05 16:40:48 crc kubenswrapper[4778]: E1205 16:40:48.250581 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(233d3868-b9e1-4500-8f58-101a60f83778)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="233d3868-b9e1-4500-8f58-101a60f83778" Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.398577 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"] Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.407718 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zj9vf"] Dec 05 16:40:58 crc 
kubenswrapper[4778]: I1205 16:40:58.454003 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watchere51a-account-delete-rb7s8"] Dec 05 16:40:58 crc kubenswrapper[4778]: E1205 16:40:58.454500 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268096e2-14e3-4009-9679-c72badaf3b74" containerName="extract-content" Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.454523 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="268096e2-14e3-4009-9679-c72badaf3b74" containerName="extract-content" Dec 05 16:40:58 crc kubenswrapper[4778]: E1205 16:40:58.454550 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268096e2-14e3-4009-9679-c72badaf3b74" containerName="registry-server" Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.454558 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="268096e2-14e3-4009-9679-c72badaf3b74" containerName="registry-server" Dec 05 16:40:58 crc kubenswrapper[4778]: E1205 16:40:58.454574 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268096e2-14e3-4009-9679-c72badaf3b74" containerName="extract-utilities" Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.454583 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="268096e2-14e3-4009-9679-c72badaf3b74" containerName="extract-utilities" Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.454792 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="268096e2-14e3-4009-9679-c72badaf3b74" containerName="registry-server" Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.455546 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8" Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.475785 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchere51a-account-delete-rb7s8"] Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.495299 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.495531 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="61de5783-faa3-466b-8121-69d6c8dcb01b" containerName="watcher-applier" containerID="cri-o://c946218ff5881070cb0218aceb13162a4c809c65d77f11a88834c149b5e58d12" gracePeriod=30 Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.571125 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.571548 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="0d620ac9-3637-4725-8ba8-bd2573ecd345" containerName="watcher-api" containerID="cri-o://59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40" gracePeriod=30 Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.571433 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="0d620ac9-3637-4725-8ba8-bd2573ecd345" containerName="watcher-kuttl-api-log" containerID="cri-o://ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e" gracePeriod=30 Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.577793 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 
16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.578016 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" containerName="watcher-kuttl-api-log" containerID="cri-o://093eea9ca1289144db233d06bdf0c20c91f488ca2e07beacc0855db06ecb65e1" gracePeriod=30
Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.578403 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" containerName="watcher-api" containerID="cri-o://1d46478fcd519b2f1645f0afe8864e337e6bc48b4e273bfc7928c0748ea9ea3f" gracePeriod=30
Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.597913 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadd04c1-2699-4ec5-9886-972167a1fb0b-operator-scripts\") pod \"watchere51a-account-delete-rb7s8\" (UID: \"dadd04c1-2699-4ec5-9886-972167a1fb0b\") " pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8"
Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.598016 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bsn2\" (UniqueName: \"kubernetes.io/projected/dadd04c1-2699-4ec5-9886-972167a1fb0b-kube-api-access-7bsn2\") pod \"watchere51a-account-delete-rb7s8\" (UID: \"dadd04c1-2699-4ec5-9886-972167a1fb0b\") " pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8"
Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.621589 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.699237 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadd04c1-2699-4ec5-9886-972167a1fb0b-operator-scripts\") pod \"watchere51a-account-delete-rb7s8\" (UID: \"dadd04c1-2699-4ec5-9886-972167a1fb0b\") " pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8"
Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.699355 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bsn2\" (UniqueName: \"kubernetes.io/projected/dadd04c1-2699-4ec5-9886-972167a1fb0b-kube-api-access-7bsn2\") pod \"watchere51a-account-delete-rb7s8\" (UID: \"dadd04c1-2699-4ec5-9886-972167a1fb0b\") " pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8"
Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.700141 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadd04c1-2699-4ec5-9886-972167a1fb0b-operator-scripts\") pod \"watchere51a-account-delete-rb7s8\" (UID: \"dadd04c1-2699-4ec5-9886-972167a1fb0b\") " pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8"
Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.729789 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bsn2\" (UniqueName: \"kubernetes.io/projected/dadd04c1-2699-4ec5-9886-972167a1fb0b-kube-api-access-7bsn2\") pod \"watchere51a-account-delete-rb7s8\" (UID: \"dadd04c1-2699-4ec5-9886-972167a1fb0b\") " pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8"
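[Editor's note] The "Killing container with a grace period ... gracePeriod=30" entries above send SIGTERM first and only escalate to SIGKILL if the process outlives the grace period; the exitCode=143 values just below are the conventional 128 + signal-number encoding (SIGTERM is 15), i.e. the watcher-api containers exited promptly on the TERM. A tiny POSIX-only Python demonstration of that encoding:

    import signal
    import subprocess

    proc = subprocess.Popen(["sleep", "60"])
    proc.send_signal(signal.SIGTERM)   # what the runtime sends at kill time
    proc.wait()
    # Python reports signal death as a negative returncode; shells and container
    # runtimes report 128 + N instead, hence exitCode=143 for SIGTERM (N=15).
    print(proc.returncode, 128 + signal.SIGTERM)  # -15 143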
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_233d3868-b9e1-4500-8f58-101a60f83778/watcher-decision-engine/5.log" Dec 05 16:40:58 crc kubenswrapper[4778]: I1205 16:40:58.778586 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.051282 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.071616 4778 generic.go:334] "Generic (PLEG): container finished" podID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" containerID="093eea9ca1289144db233d06bdf0c20c91f488ca2e07beacc0855db06ecb65e1" exitCode=143 Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.071957 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd","Type":"ContainerDied","Data":"093eea9ca1289144db233d06bdf0c20c91f488ca2e07beacc0855db06ecb65e1"} Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.099076 4778 generic.go:334] "Generic (PLEG): container finished" podID="0d620ac9-3637-4725-8ba8-bd2573ecd345" containerID="ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e" exitCode=143 Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.099181 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"0d620ac9-3637-4725-8ba8-bd2573ecd345","Type":"ContainerDied","Data":"ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e"} Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.104299 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"233d3868-b9e1-4500-8f58-101a60f83778","Type":"ContainerDied","Data":"8d75f24abdf0feb656e183c7b84ded8b54401955b98da6b7095809e76b9559c4"} Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.104356 4778 scope.go:117] "RemoveContainer" containerID="cae214bfe79163de5e4792871271129e34120fd56d08580b29f30e4fe55dbc64" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.104514 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.206984 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfg9t\" (UniqueName: \"kubernetes.io/projected/233d3868-b9e1-4500-8f58-101a60f83778-kube-api-access-xfg9t\") pod \"233d3868-b9e1-4500-8f58-101a60f83778\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.207054 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233d3868-b9e1-4500-8f58-101a60f83778-logs\") pod \"233d3868-b9e1-4500-8f58-101a60f83778\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.207085 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-custom-prometheus-ca\") pod \"233d3868-b9e1-4500-8f58-101a60f83778\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.207098 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-combined-ca-bundle\") pod \"233d3868-b9e1-4500-8f58-101a60f83778\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.207135 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-config-data\") pod \"233d3868-b9e1-4500-8f58-101a60f83778\" (UID: \"233d3868-b9e1-4500-8f58-101a60f83778\") " Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.208881 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233d3868-b9e1-4500-8f58-101a60f83778-logs" (OuterVolumeSpecName: "logs") pod "233d3868-b9e1-4500-8f58-101a60f83778" (UID: "233d3868-b9e1-4500-8f58-101a60f83778"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.213821 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233d3868-b9e1-4500-8f58-101a60f83778-kube-api-access-xfg9t" (OuterVolumeSpecName: "kube-api-access-xfg9t") pod "233d3868-b9e1-4500-8f58-101a60f83778" (UID: "233d3868-b9e1-4500-8f58-101a60f83778"). InnerVolumeSpecName "kube-api-access-xfg9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.240510 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "233d3868-b9e1-4500-8f58-101a60f83778" (UID: "233d3868-b9e1-4500-8f58-101a60f83778"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.242483 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "233d3868-b9e1-4500-8f58-101a60f83778" (UID: "233d3868-b9e1-4500-8f58-101a60f83778"). 
InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.275788 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-config-data" (OuterVolumeSpecName: "config-data") pod "233d3868-b9e1-4500-8f58-101a60f83778" (UID: "233d3868-b9e1-4500-8f58-101a60f83778"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.285851 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2588746d-a6c5-4c07-b515-7ea2429723cb" path="/var/lib/kubelet/pods/2588746d-a6c5-4c07-b515-7ea2429723cb/volumes" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.308624 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfg9t\" (UniqueName: \"kubernetes.io/projected/233d3868-b9e1-4500-8f58-101a60f83778-kube-api-access-xfg9t\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.308655 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233d3868-b9e1-4500-8f58-101a60f83778-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.308665 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.308673 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.308684 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233d3868-b9e1-4500-8f58-101a60f83778-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.390053 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchere51a-account-delete-rb7s8"] Dec 05 16:40:59 crc kubenswrapper[4778]: W1205 16:40:59.394868 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddadd04c1_2699_4ec5_9886_972167a1fb0b.slice/crio-d4b33f4cd99edf2fe5ea91128f4761a1107ccf7fe8084506feb5c3912f11c413 WatchSource:0}: Error finding container d4b33f4cd99edf2fe5ea91128f4761a1107ccf7fe8084506feb5c3912f11c413: Status 404 returned error can't find the container with id d4b33f4cd99edf2fe5ea91128f4761a1107ccf7fe8084506feb5c3912f11c413 Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.459573 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.488985 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:40:59 crc kubenswrapper[4778]: E1205 16:40:59.564968 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c946218ff5881070cb0218aceb13162a4c809c65d77f11a88834c149b5e58d12" 
cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:40:59 crc kubenswrapper[4778]: E1205 16:40:59.568296 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c946218ff5881070cb0218aceb13162a4c809c65d77f11a88834c149b5e58d12" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:40:59 crc kubenswrapper[4778]: E1205 16:40:59.571752 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c946218ff5881070cb0218aceb13162a4c809c65d77f11a88834c149b5e58d12" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:40:59 crc kubenswrapper[4778]: E1205 16:40:59.571808 4778 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="61de5783-faa3-466b-8121-69d6c8dcb01b" containerName="watcher-applier" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.704454 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.168:9322/\": read tcp 10.217.0.2:49582->10.217.0.168:9322: read: connection reset by peer" Dec 05 16:40:59 crc kubenswrapper[4778]: I1205 16:40:59.704455 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.168:9322/\": read tcp 10.217.0.2:49596->10.217.0.168:9322: read: connection reset by peer" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.114223 4778 generic.go:334] "Generic (PLEG): container finished" podID="dadd04c1-2699-4ec5-9886-972167a1fb0b" containerID="f1c305ae03ed4d88c11880f3e74b788b6cc7a1dabee53c513fd1c913b3b11919" exitCode=0 Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.114286 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8" event={"ID":"dadd04c1-2699-4ec5-9886-972167a1fb0b","Type":"ContainerDied","Data":"f1c305ae03ed4d88c11880f3e74b788b6cc7a1dabee53c513fd1c913b3b11919"} Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.114311 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8" event={"ID":"dadd04c1-2699-4ec5-9886-972167a1fb0b","Type":"ContainerStarted","Data":"d4b33f4cd99edf2fe5ea91128f4761a1107ccf7fe8084506feb5c3912f11c413"} Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.117811 4778 generic.go:334] "Generic (PLEG): container finished" podID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" containerID="1d46478fcd519b2f1645f0afe8864e337e6bc48b4e273bfc7928c0748ea9ea3f" exitCode=0 Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.117852 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd","Type":"ContainerDied","Data":"1d46478fcd519b2f1645f0afe8864e337e6bc48b4e273bfc7928c0748ea9ea3f"} Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 
16:41:00.188179 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.313795 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="0d620ac9-3637-4725-8ba8-bd2573ecd345" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.167:9322/\": read tcp 10.217.0.2:40414->10.217.0.167:9322: read: connection reset by peer" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.314120 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="0d620ac9-3637-4725-8ba8-bd2573ecd345" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9322/\": read tcp 10.217.0.2:40418->10.217.0.167:9322: read: connection reset by peer" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.328689 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-combined-ca-bundle\") pod \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.328846 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-config-data\") pod \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.328873 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-custom-prometheus-ca\") pod \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.328963 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-logs\") pod \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.329063 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4xfv\" (UniqueName: \"kubernetes.io/projected/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-kube-api-access-z4xfv\") pod \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\" (UID: \"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd\") " Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.329591 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-logs" (OuterVolumeSpecName: "logs") pod "070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" (UID: "070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.344236 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-kube-api-access-z4xfv" (OuterVolumeSpecName: "kube-api-access-z4xfv") pod "070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" (UID: "070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd"). InnerVolumeSpecName "kube-api-access-z4xfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.363596 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" (UID: "070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.383398 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" (UID: "070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.400721 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-config-data" (OuterVolumeSpecName: "config-data") pod "070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" (UID: "070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.430532 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.430563 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.430574 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.430585 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4xfv\" (UniqueName: \"kubernetes.io/projected/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-kube-api-access-z4xfv\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.430597 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.669159 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.836662 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d620ac9-3637-4725-8ba8-bd2573ecd345-logs\") pod \"0d620ac9-3637-4725-8ba8-bd2573ecd345\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.836749 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-custom-prometheus-ca\") pod \"0d620ac9-3637-4725-8ba8-bd2573ecd345\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.836792 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-combined-ca-bundle\") pod \"0d620ac9-3637-4725-8ba8-bd2573ecd345\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.836830 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k454g\" (UniqueName: \"kubernetes.io/projected/0d620ac9-3637-4725-8ba8-bd2573ecd345-kube-api-access-k454g\") pod \"0d620ac9-3637-4725-8ba8-bd2573ecd345\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.836863 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-config-data\") pod \"0d620ac9-3637-4725-8ba8-bd2573ecd345\" (UID: \"0d620ac9-3637-4725-8ba8-bd2573ecd345\") " Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.837588 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d620ac9-3637-4725-8ba8-bd2573ecd345-logs" (OuterVolumeSpecName: "logs") pod "0d620ac9-3637-4725-8ba8-bd2573ecd345" (UID: "0d620ac9-3637-4725-8ba8-bd2573ecd345"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.851714 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d620ac9-3637-4725-8ba8-bd2573ecd345-kube-api-access-k454g" (OuterVolumeSpecName: "kube-api-access-k454g") pod "0d620ac9-3637-4725-8ba8-bd2573ecd345" (UID: "0d620ac9-3637-4725-8ba8-bd2573ecd345"). InnerVolumeSpecName "kube-api-access-k454g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.858522 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d620ac9-3637-4725-8ba8-bd2573ecd345" (UID: "0d620ac9-3637-4725-8ba8-bd2573ecd345"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.859887 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "0d620ac9-3637-4725-8ba8-bd2573ecd345" (UID: "0d620ac9-3637-4725-8ba8-bd2573ecd345"). 
InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.887682 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-config-data" (OuterVolumeSpecName: "config-data") pod "0d620ac9-3637-4725-8ba8-bd2573ecd345" (UID: "0d620ac9-3637-4725-8ba8-bd2573ecd345"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.938672 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.938707 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k454g\" (UniqueName: \"kubernetes.io/projected/0d620ac9-3637-4725-8ba8-bd2573ecd345-kube-api-access-k454g\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.938722 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.938736 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d620ac9-3637-4725-8ba8-bd2573ecd345-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:00 crc kubenswrapper[4778]: I1205 16:41:00.938749 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0d620ac9-3637-4725-8ba8-bd2573ecd345-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.127999 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd","Type":"ContainerDied","Data":"6ed5ae9c071462facac76dc86718bbe4f9e382f7e68bcb6f3907ab4ddd32fccb"} Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.128058 4778 scope.go:117] "RemoveContainer" containerID="1d46478fcd519b2f1645f0afe8864e337e6bc48b4e273bfc7928c0748ea9ea3f" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.128207 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.134181 4778 generic.go:334] "Generic (PLEG): container finished" podID="0d620ac9-3637-4725-8ba8-bd2573ecd345" containerID="59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40" exitCode=0 Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.134233 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"0d620ac9-3637-4725-8ba8-bd2573ecd345","Type":"ContainerDied","Data":"59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40"} Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.134288 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"0d620ac9-3637-4725-8ba8-bd2573ecd345","Type":"ContainerDied","Data":"0b4ab3b52ba86aa45397b582429a9cab312e24abf6f1cf4772a0290bd32de442"} Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.134462 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.163863 4778 scope.go:117] "RemoveContainer" containerID="093eea9ca1289144db233d06bdf0c20c91f488ca2e07beacc0855db06ecb65e1" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.167880 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.185692 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.196546 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.198984 4778 scope.go:117] "RemoveContainer" containerID="59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.207060 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.218468 4778 scope.go:117] "RemoveContainer" containerID="ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.237209 4778 scope.go:117] "RemoveContainer" containerID="59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40" Dec 05 16:41:01 crc kubenswrapper[4778]: E1205 16:41:01.238064 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40\": container with ID starting with 59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40 not found: ID does not exist" containerID="59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.238104 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40"} err="failed to get container status \"59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40\": rpc error: code = NotFound desc = could not find container \"59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40\": container with ID starting with 
59d9f787ac6c543d52ae68de3f35faf4e5b055d4b4ed206a104daf93d8d32c40 not found: ID does not exist" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.238127 4778 scope.go:117] "RemoveContainer" containerID="ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e" Dec 05 16:41:01 crc kubenswrapper[4778]: E1205 16:41:01.241318 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e\": container with ID starting with ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e not found: ID does not exist" containerID="ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.241384 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e"} err="failed to get container status \"ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e\": rpc error: code = NotFound desc = could not find container \"ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e\": container with ID starting with ae64e6c99301f3379318e8b507ed30e124851801a30f132ca033ba945fa38e6e not found: ID does not exist" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.263697 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" path="/var/lib/kubelet/pods/070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd/volumes" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.265518 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d620ac9-3637-4725-8ba8-bd2573ecd345" path="/var/lib/kubelet/pods/0d620ac9-3637-4725-8ba8-bd2573ecd345/volumes" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.266270 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233d3868-b9e1-4500-8f58-101a60f83778" path="/var/lib/kubelet/pods/233d3868-b9e1-4500-8f58-101a60f83778/volumes" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.462155 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.546820 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadd04c1-2699-4ec5-9886-972167a1fb0b-operator-scripts\") pod \"dadd04c1-2699-4ec5-9886-972167a1fb0b\" (UID: \"dadd04c1-2699-4ec5-9886-972167a1fb0b\") " Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.546884 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bsn2\" (UniqueName: \"kubernetes.io/projected/dadd04c1-2699-4ec5-9886-972167a1fb0b-kube-api-access-7bsn2\") pod \"dadd04c1-2699-4ec5-9886-972167a1fb0b\" (UID: \"dadd04c1-2699-4ec5-9886-972167a1fb0b\") " Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.547299 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadd04c1-2699-4ec5-9886-972167a1fb0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dadd04c1-2699-4ec5-9886-972167a1fb0b" (UID: "dadd04c1-2699-4ec5-9886-972167a1fb0b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.549976 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadd04c1-2699-4ec5-9886-972167a1fb0b-kube-api-access-7bsn2" (OuterVolumeSpecName: "kube-api-access-7bsn2") pod "dadd04c1-2699-4ec5-9886-972167a1fb0b" (UID: "dadd04c1-2699-4ec5-9886-972167a1fb0b"). InnerVolumeSpecName "kube-api-access-7bsn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.649423 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadd04c1-2699-4ec5-9886-972167a1fb0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:01 crc kubenswrapper[4778]: I1205 16:41:01.649475 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bsn2\" (UniqueName: \"kubernetes.io/projected/dadd04c1-2699-4ec5-9886-972167a1fb0b-kube-api-access-7bsn2\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:02 crc kubenswrapper[4778]: I1205 16:41:02.146702 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8" Dec 05 16:41:02 crc kubenswrapper[4778]: I1205 16:41:02.146700 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchere51a-account-delete-rb7s8" event={"ID":"dadd04c1-2699-4ec5-9886-972167a1fb0b","Type":"ContainerDied","Data":"d4b33f4cd99edf2fe5ea91128f4761a1107ccf7fe8084506feb5c3912f11c413"} Dec 05 16:41:02 crc kubenswrapper[4778]: I1205 16:41:02.146820 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4b33f4cd99edf2fe5ea91128f4761a1107ccf7fe8084506feb5c3912f11c413" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.495967 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-k8k7x"] Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.505331 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-k8k7x"] Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.550051 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watchere51a-account-delete-rb7s8"] Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.558747 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watchere51a-account-delete-rb7s8"] Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.569197 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"] Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.591836 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-e51a-account-create-update-7vwlq"] Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.606427 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-vk6n7"] Dec 05 16:41:03 crc kubenswrapper[4778]: E1205 16:41:03.606876 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d620ac9-3637-4725-8ba8-bd2573ecd345" containerName="watcher-api" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.606906 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d620ac9-3637-4725-8ba8-bd2573ecd345" containerName="watcher-api" Dec 05 16:41:03 crc kubenswrapper[4778]: E1205 16:41:03.606927 4778 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" containerName="watcher-api" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.606936 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" containerName="watcher-api" Dec 05 16:41:03 crc kubenswrapper[4778]: E1205 16:41:03.606947 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.606955 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: E1205 16:41:03.606967 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" containerName="watcher-kuttl-api-log" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.606975 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" containerName="watcher-kuttl-api-log" Dec 05 16:41:03 crc kubenswrapper[4778]: E1205 16:41:03.607000 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607007 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: E1205 16:41:03.607025 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadd04c1-2699-4ec5-9886-972167a1fb0b" containerName="mariadb-account-delete" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607032 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadd04c1-2699-4ec5-9886-972167a1fb0b" containerName="mariadb-account-delete" Dec 05 16:41:03 crc kubenswrapper[4778]: E1205 16:41:03.607049 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607057 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: E1205 16:41:03.607070 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d620ac9-3637-4725-8ba8-bd2573ecd345" containerName="watcher-kuttl-api-log" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607078 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d620ac9-3637-4725-8ba8-bd2573ecd345" containerName="watcher-kuttl-api-log" Dec 05 16:41:03 crc kubenswrapper[4778]: E1205 16:41:03.607094 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607101 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607289 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d620ac9-3637-4725-8ba8-bd2573ecd345" containerName="watcher-api" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607306 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" 
containerName="watcher-kuttl-api-log" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607321 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d620ac9-3637-4725-8ba8-bd2573ecd345" containerName="watcher-kuttl-api-log" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607333 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607344 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607355 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607389 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="070175cf-65a4-4bbb-a5c0-1c3fe31fb2fd" containerName="watcher-api" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607397 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.607409 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dadd04c1-2699-4ec5-9886-972167a1fb0b" containerName="mariadb-account-delete" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.608134 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-vk6n7" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.616792 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-vk6n7"] Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.679908 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd6j4\" (UniqueName: \"kubernetes.io/projected/2093d628-dfe1-4bdf-bc15-f148fef55c4e-kube-api-access-hd6j4\") pod \"watcher-db-create-vk6n7\" (UID: \"2093d628-dfe1-4bdf-bc15-f148fef55c4e\") " pod="watcher-kuttl-default/watcher-db-create-vk6n7" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.679986 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2093d628-dfe1-4bdf-bc15-f148fef55c4e-operator-scripts\") pod \"watcher-db-create-vk6n7\" (UID: \"2093d628-dfe1-4bdf-bc15-f148fef55c4e\") " pod="watcher-kuttl-default/watcher-db-create-vk6n7" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.712972 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2"] Dec 05 16:41:03 crc kubenswrapper[4778]: E1205 16:41:03.713409 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.713431 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: E1205 16:41:03.713454 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: 
I1205 16:41:03.713463 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.713621 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.714347 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.718459 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2"] Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.718729 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.781035 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2093d628-dfe1-4bdf-bc15-f148fef55c4e-operator-scripts\") pod \"watcher-db-create-vk6n7\" (UID: \"2093d628-dfe1-4bdf-bc15-f148fef55c4e\") " pod="watcher-kuttl-default/watcher-db-create-vk6n7" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.781123 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/802eb834-b4e6-4946-abbe-636096f213c7-operator-scripts\") pod \"watcher-2ff6-account-create-update-bmvf2\" (UID: \"802eb834-b4e6-4946-abbe-636096f213c7\") " pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.781160 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt7l9\" (UniqueName: \"kubernetes.io/projected/802eb834-b4e6-4946-abbe-636096f213c7-kube-api-access-jt7l9\") pod \"watcher-2ff6-account-create-update-bmvf2\" (UID: \"802eb834-b4e6-4946-abbe-636096f213c7\") " pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.781208 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd6j4\" (UniqueName: \"kubernetes.io/projected/2093d628-dfe1-4bdf-bc15-f148fef55c4e-kube-api-access-hd6j4\") pod \"watcher-db-create-vk6n7\" (UID: \"2093d628-dfe1-4bdf-bc15-f148fef55c4e\") " pod="watcher-kuttl-default/watcher-db-create-vk6n7" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.782004 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2093d628-dfe1-4bdf-bc15-f148fef55c4e-operator-scripts\") pod \"watcher-db-create-vk6n7\" (UID: \"2093d628-dfe1-4bdf-bc15-f148fef55c4e\") " pod="watcher-kuttl-default/watcher-db-create-vk6n7" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.801936 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd6j4\" (UniqueName: \"kubernetes.io/projected/2093d628-dfe1-4bdf-bc15-f148fef55c4e-kube-api-access-hd6j4\") pod \"watcher-db-create-vk6n7\" (UID: \"2093d628-dfe1-4bdf-bc15-f148fef55c4e\") " pod="watcher-kuttl-default/watcher-db-create-vk6n7" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.883412 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/802eb834-b4e6-4946-abbe-636096f213c7-operator-scripts\") pod \"watcher-2ff6-account-create-update-bmvf2\" (UID: \"802eb834-b4e6-4946-abbe-636096f213c7\") " pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.883667 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt7l9\" (UniqueName: \"kubernetes.io/projected/802eb834-b4e6-4946-abbe-636096f213c7-kube-api-access-jt7l9\") pod \"watcher-2ff6-account-create-update-bmvf2\" (UID: \"802eb834-b4e6-4946-abbe-636096f213c7\") " pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.884153 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/802eb834-b4e6-4946-abbe-636096f213c7-operator-scripts\") pod \"watcher-2ff6-account-create-update-bmvf2\" (UID: \"802eb834-b4e6-4946-abbe-636096f213c7\") " pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.912082 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt7l9\" (UniqueName: \"kubernetes.io/projected/802eb834-b4e6-4946-abbe-636096f213c7-kube-api-access-jt7l9\") pod \"watcher-2ff6-account-create-update-bmvf2\" (UID: \"802eb834-b4e6-4946-abbe-636096f213c7\") " pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" Dec 05 16:41:03 crc kubenswrapper[4778]: I1205 16:41:03.990804 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-vk6n7" Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.049505 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.164529 4778 generic.go:334] "Generic (PLEG): container finished" podID="61de5783-faa3-466b-8121-69d6c8dcb01b" containerID="c946218ff5881070cb0218aceb13162a4c809c65d77f11a88834c149b5e58d12" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.164798 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"61de5783-faa3-466b-8121-69d6c8dcb01b","Type":"ContainerDied","Data":"c946218ff5881070cb0218aceb13162a4c809c65d77f11a88834c149b5e58d12"} Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.164864 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"61de5783-faa3-466b-8121-69d6c8dcb01b","Type":"ContainerDied","Data":"59df40978a926405a7dfa5f506a9c0ce6e82d32a90e3a766ea05ef30f4f5938f"} Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.164877 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59df40978a926405a7dfa5f506a9c0ce6e82d32a90e3a766ea05ef30f4f5938f" Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.193794 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.293039 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-combined-ca-bundle\") pod \"61de5783-faa3-466b-8121-69d6c8dcb01b\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.293117 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-config-data\") pod \"61de5783-faa3-466b-8121-69d6c8dcb01b\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.293184 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn7jv\" (UniqueName: \"kubernetes.io/projected/61de5783-faa3-466b-8121-69d6c8dcb01b-kube-api-access-gn7jv\") pod \"61de5783-faa3-466b-8121-69d6c8dcb01b\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.293345 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61de5783-faa3-466b-8121-69d6c8dcb01b-logs\") pod \"61de5783-faa3-466b-8121-69d6c8dcb01b\" (UID: \"61de5783-faa3-466b-8121-69d6c8dcb01b\") " Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.294209 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61de5783-faa3-466b-8121-69d6c8dcb01b-logs" (OuterVolumeSpecName: "logs") pod "61de5783-faa3-466b-8121-69d6c8dcb01b" (UID: "61de5783-faa3-466b-8121-69d6c8dcb01b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.300755 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61de5783-faa3-466b-8121-69d6c8dcb01b-kube-api-access-gn7jv" (OuterVolumeSpecName: "kube-api-access-gn7jv") pod "61de5783-faa3-466b-8121-69d6c8dcb01b" (UID: "61de5783-faa3-466b-8121-69d6c8dcb01b"). InnerVolumeSpecName "kube-api-access-gn7jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.335545 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61de5783-faa3-466b-8121-69d6c8dcb01b" (UID: "61de5783-faa3-466b-8121-69d6c8dcb01b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.374785 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-vk6n7"] Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.395188 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.395216 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn7jv\" (UniqueName: \"kubernetes.io/projected/61de5783-faa3-466b-8121-69d6c8dcb01b-kube-api-access-gn7jv\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.395225 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61de5783-faa3-466b-8121-69d6c8dcb01b-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.419480 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-config-data" (OuterVolumeSpecName: "config-data") pod "61de5783-faa3-466b-8121-69d6c8dcb01b" (UID: "61de5783-faa3-466b-8121-69d6c8dcb01b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.497463 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de5783-faa3-466b-8121-69d6c8dcb01b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4778]: I1205 16:41:04.651781 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2"] Dec 05 16:41:04 crc kubenswrapper[4778]: W1205 16:41:04.651827 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod802eb834_b4e6_4946_abbe_636096f213c7.slice/crio-fcd6575af550c9145304e5f66f1584e5b8fbf481355cde1ca07a5431e4731847 WatchSource:0}: Error finding container fcd6575af550c9145304e5f66f1584e5b8fbf481355cde1ca07a5431e4731847: Status 404 returned error can't find the container with id fcd6575af550c9145304e5f66f1584e5b8fbf481355cde1ca07a5431e4731847 Dec 05 16:41:05 crc kubenswrapper[4778]: I1205 16:41:05.178013 4778 generic.go:334] "Generic (PLEG): container finished" podID="2093d628-dfe1-4bdf-bc15-f148fef55c4e" containerID="e16b064014592c74a716afbbbe8b0ead37d7ab5ec1cd747f13099642d5ee10ca" exitCode=0 Dec 05 16:41:05 crc kubenswrapper[4778]: I1205 16:41:05.178226 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-vk6n7" event={"ID":"2093d628-dfe1-4bdf-bc15-f148fef55c4e","Type":"ContainerDied","Data":"e16b064014592c74a716afbbbe8b0ead37d7ab5ec1cd747f13099642d5ee10ca"} Dec 05 16:41:05 crc kubenswrapper[4778]: I1205 16:41:05.178461 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-vk6n7" event={"ID":"2093d628-dfe1-4bdf-bc15-f148fef55c4e","Type":"ContainerStarted","Data":"ba52974d3c5727900b27e26b98b7d079a17b0480e17fc365f85a5181a14956c9"} Dec 05 16:41:05 crc kubenswrapper[4778]: I1205 16:41:05.184158 4778 generic.go:334] "Generic (PLEG): container finished" podID="802eb834-b4e6-4946-abbe-636096f213c7" 
containerID="e33e7bc8c843e95a0c050e337c950e3e6bb8ef545948afbcc7484cb838f3b56b" exitCode=0 Dec 05 16:41:05 crc kubenswrapper[4778]: I1205 16:41:05.184226 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:05 crc kubenswrapper[4778]: I1205 16:41:05.184582 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" event={"ID":"802eb834-b4e6-4946-abbe-636096f213c7","Type":"ContainerDied","Data":"e33e7bc8c843e95a0c050e337c950e3e6bb8ef545948afbcc7484cb838f3b56b"} Dec 05 16:41:05 crc kubenswrapper[4778]: I1205 16:41:05.184640 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" event={"ID":"802eb834-b4e6-4946-abbe-636096f213c7","Type":"ContainerStarted","Data":"fcd6575af550c9145304e5f66f1584e5b8fbf481355cde1ca07a5431e4731847"} Dec 05 16:41:05 crc kubenswrapper[4778]: I1205 16:41:05.246464 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:41:05 crc kubenswrapper[4778]: I1205 16:41:05.258894 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3002d1-ef2d-4ae6-b699-149a3b7456cd" path="/var/lib/kubelet/pods/3e3002d1-ef2d-4ae6-b699-149a3b7456cd/volumes" Dec 05 16:41:05 crc kubenswrapper[4778]: I1205 16:41:05.259588 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44210ac8-050b-4b56-b2f5-3afe7deae253" path="/var/lib/kubelet/pods/44210ac8-050b-4b56-b2f5-3afe7deae253/volumes" Dec 05 16:41:05 crc kubenswrapper[4778]: I1205 16:41:05.260176 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dadd04c1-2699-4ec5-9886-972167a1fb0b" path="/var/lib/kubelet/pods/dadd04c1-2699-4ec5-9886-972167a1fb0b/volumes" Dec 05 16:41:05 crc kubenswrapper[4778]: I1205 16:41:05.260671 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.605798 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-vk6n7" Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.612210 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.731432 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd6j4\" (UniqueName: \"kubernetes.io/projected/2093d628-dfe1-4bdf-bc15-f148fef55c4e-kube-api-access-hd6j4\") pod \"2093d628-dfe1-4bdf-bc15-f148fef55c4e\" (UID: \"2093d628-dfe1-4bdf-bc15-f148fef55c4e\") " Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.731532 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2093d628-dfe1-4bdf-bc15-f148fef55c4e-operator-scripts\") pod \"2093d628-dfe1-4bdf-bc15-f148fef55c4e\" (UID: \"2093d628-dfe1-4bdf-bc15-f148fef55c4e\") " Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.731565 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/802eb834-b4e6-4946-abbe-636096f213c7-operator-scripts\") pod \"802eb834-b4e6-4946-abbe-636096f213c7\" (UID: \"802eb834-b4e6-4946-abbe-636096f213c7\") " Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.731647 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt7l9\" (UniqueName: \"kubernetes.io/projected/802eb834-b4e6-4946-abbe-636096f213c7-kube-api-access-jt7l9\") pod \"802eb834-b4e6-4946-abbe-636096f213c7\" (UID: \"802eb834-b4e6-4946-abbe-636096f213c7\") " Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.731918 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2093d628-dfe1-4bdf-bc15-f148fef55c4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2093d628-dfe1-4bdf-bc15-f148fef55c4e" (UID: "2093d628-dfe1-4bdf-bc15-f148fef55c4e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.732119 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2093d628-dfe1-4bdf-bc15-f148fef55c4e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.732226 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/802eb834-b4e6-4946-abbe-636096f213c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "802eb834-b4e6-4946-abbe-636096f213c7" (UID: "802eb834-b4e6-4946-abbe-636096f213c7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.737649 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2093d628-dfe1-4bdf-bc15-f148fef55c4e-kube-api-access-hd6j4" (OuterVolumeSpecName: "kube-api-access-hd6j4") pod "2093d628-dfe1-4bdf-bc15-f148fef55c4e" (UID: "2093d628-dfe1-4bdf-bc15-f148fef55c4e"). InnerVolumeSpecName "kube-api-access-hd6j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.737830 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/802eb834-b4e6-4946-abbe-636096f213c7-kube-api-access-jt7l9" (OuterVolumeSpecName: "kube-api-access-jt7l9") pod "802eb834-b4e6-4946-abbe-636096f213c7" (UID: "802eb834-b4e6-4946-abbe-636096f213c7"). 
InnerVolumeSpecName "kube-api-access-jt7l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.833783 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd6j4\" (UniqueName: \"kubernetes.io/projected/2093d628-dfe1-4bdf-bc15-f148fef55c4e-kube-api-access-hd6j4\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.833813 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/802eb834-b4e6-4946-abbe-636096f213c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:06 crc kubenswrapper[4778]: I1205 16:41:06.833821 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt7l9\" (UniqueName: \"kubernetes.io/projected/802eb834-b4e6-4946-abbe-636096f213c7-kube-api-access-jt7l9\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:07 crc kubenswrapper[4778]: I1205 16:41:07.215440 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" event={"ID":"802eb834-b4e6-4946-abbe-636096f213c7","Type":"ContainerDied","Data":"fcd6575af550c9145304e5f66f1584e5b8fbf481355cde1ca07a5431e4731847"} Dec 05 16:41:07 crc kubenswrapper[4778]: I1205 16:41:07.215755 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcd6575af550c9145304e5f66f1584e5b8fbf481355cde1ca07a5431e4731847" Dec 05 16:41:07 crc kubenswrapper[4778]: I1205 16:41:07.215473 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2" Dec 05 16:41:07 crc kubenswrapper[4778]: I1205 16:41:07.217424 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-vk6n7" event={"ID":"2093d628-dfe1-4bdf-bc15-f148fef55c4e","Type":"ContainerDied","Data":"ba52974d3c5727900b27e26b98b7d079a17b0480e17fc365f85a5181a14956c9"} Dec 05 16:41:07 crc kubenswrapper[4778]: I1205 16:41:07.217452 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba52974d3c5727900b27e26b98b7d079a17b0480e17fc365f85a5181a14956c9" Dec 05 16:41:07 crc kubenswrapper[4778]: I1205 16:41:07.217504 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-vk6n7" Dec 05 16:41:07 crc kubenswrapper[4778]: I1205 16:41:07.261233 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61de5783-faa3-466b-8121-69d6c8dcb01b" path="/var/lib/kubelet/pods/61de5783-faa3-466b-8121-69d6c8dcb01b/volumes" Dec 05 16:41:08 crc kubenswrapper[4778]: I1205 16:41:08.956895 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd"] Dec 05 16:41:08 crc kubenswrapper[4778]: E1205 16:41:08.957235 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61de5783-faa3-466b-8121-69d6c8dcb01b" containerName="watcher-applier" Dec 05 16:41:08 crc kubenswrapper[4778]: I1205 16:41:08.957248 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="61de5783-faa3-466b-8121-69d6c8dcb01b" containerName="watcher-applier" Dec 05 16:41:08 crc kubenswrapper[4778]: E1205 16:41:08.957265 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="802eb834-b4e6-4946-abbe-636096f213c7" containerName="mariadb-account-create-update" Dec 05 16:41:08 crc kubenswrapper[4778]: I1205 16:41:08.957271 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="802eb834-b4e6-4946-abbe-636096f213c7" containerName="mariadb-account-create-update" Dec 05 16:41:08 crc kubenswrapper[4778]: E1205 16:41:08.957287 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2093d628-dfe1-4bdf-bc15-f148fef55c4e" containerName="mariadb-database-create" Dec 05 16:41:08 crc kubenswrapper[4778]: I1205 16:41:08.957296 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2093d628-dfe1-4bdf-bc15-f148fef55c4e" containerName="mariadb-database-create" Dec 05 16:41:08 crc kubenswrapper[4778]: I1205 16:41:08.957474 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="61de5783-faa3-466b-8121-69d6c8dcb01b" containerName="watcher-applier" Dec 05 16:41:08 crc kubenswrapper[4778]: I1205 16:41:08.957526 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2093d628-dfe1-4bdf-bc15-f148fef55c4e" containerName="mariadb-database-create" Dec 05 16:41:08 crc kubenswrapper[4778]: I1205 16:41:08.957539 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="802eb834-b4e6-4946-abbe-636096f213c7" containerName="mariadb-account-create-update" Dec 05 16:41:08 crc kubenswrapper[4778]: I1205 16:41:08.957558 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="233d3868-b9e1-4500-8f58-101a60f83778" containerName="watcher-decision-engine" Dec 05 16:41:08 crc kubenswrapper[4778]: I1205 16:41:08.958128 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:08 crc kubenswrapper[4778]: I1205 16:41:08.962142 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-svrfk" Dec 05 16:41:08 crc kubenswrapper[4778]: I1205 16:41:08.962322 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 05 16:41:08 crc kubenswrapper[4778]: I1205 16:41:08.972085 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd"] Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.066689 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvgtn\" (UniqueName: \"kubernetes.io/projected/1695ca72-1c4c-496a-85cf-481567869c56-kube-api-access-qvgtn\") pod \"watcher-kuttl-db-sync-lr4hd\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.067053 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-db-sync-config-data\") pod \"watcher-kuttl-db-sync-lr4hd\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.067173 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-lr4hd\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.067318 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-config-data\") pod \"watcher-kuttl-db-sync-lr4hd\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.169125 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvgtn\" (UniqueName: \"kubernetes.io/projected/1695ca72-1c4c-496a-85cf-481567869c56-kube-api-access-qvgtn\") pod \"watcher-kuttl-db-sync-lr4hd\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.169442 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-db-sync-config-data\") pod \"watcher-kuttl-db-sync-lr4hd\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.169566 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-lr4hd\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 
16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.169657 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-config-data\") pod \"watcher-kuttl-db-sync-lr4hd\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.181573 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-config-data\") pod \"watcher-kuttl-db-sync-lr4hd\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.181582 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-lr4hd\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.183698 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvgtn\" (UniqueName: \"kubernetes.io/projected/1695ca72-1c4c-496a-85cf-481567869c56-kube-api-access-qvgtn\") pod \"watcher-kuttl-db-sync-lr4hd\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.188140 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-db-sync-config-data\") pod \"watcher-kuttl-db-sync-lr4hd\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.279143 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:09 crc kubenswrapper[4778]: I1205 16:41:09.942186 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd"] Dec 05 16:41:10 crc kubenswrapper[4778]: I1205 16:41:10.243193 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" event={"ID":"1695ca72-1c4c-496a-85cf-481567869c56","Type":"ContainerStarted","Data":"ee2d541ed224a461dd465fa6b39b67f80bfc149b2cbdd4c5e99241b3d27db58c"} Dec 05 16:41:10 crc kubenswrapper[4778]: I1205 16:41:10.244552 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" event={"ID":"1695ca72-1c4c-496a-85cf-481567869c56","Type":"ContainerStarted","Data":"0d964d18a992696df12080bc05dd996a61e19c9e8d851174ec7b9fb4d28d7c67"} Dec 05 16:41:10 crc kubenswrapper[4778]: I1205 16:41:10.264616 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" podStartSLOduration=2.264590103 podStartE2EDuration="2.264590103s" podCreationTimestamp="2025-12-05 16:41:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:41:10.254654552 +0000 UTC m=+2757.358450942" watchObservedRunningTime="2025-12-05 16:41:10.264590103 +0000 UTC m=+2757.368386503" Dec 05 16:41:13 crc kubenswrapper[4778]: I1205 16:41:13.263502 4778 generic.go:334] "Generic (PLEG): container finished" podID="1695ca72-1c4c-496a-85cf-481567869c56" containerID="ee2d541ed224a461dd465fa6b39b67f80bfc149b2cbdd4c5e99241b3d27db58c" exitCode=0 Dec 05 16:41:13 crc kubenswrapper[4778]: I1205 16:41:13.263591 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" event={"ID":"1695ca72-1c4c-496a-85cf-481567869c56","Type":"ContainerDied","Data":"ee2d541ed224a461dd465fa6b39b67f80bfc149b2cbdd4c5e99241b3d27db58c"} Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.722128 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.755529 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-db-sync-config-data\") pod \"1695ca72-1c4c-496a-85cf-481567869c56\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.755593 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-combined-ca-bundle\") pod \"1695ca72-1c4c-496a-85cf-481567869c56\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.755718 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-config-data\") pod \"1695ca72-1c4c-496a-85cf-481567869c56\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.755766 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvgtn\" (UniqueName: \"kubernetes.io/projected/1695ca72-1c4c-496a-85cf-481567869c56-kube-api-access-qvgtn\") pod \"1695ca72-1c4c-496a-85cf-481567869c56\" (UID: \"1695ca72-1c4c-496a-85cf-481567869c56\") " Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.761569 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1695ca72-1c4c-496a-85cf-481567869c56-kube-api-access-qvgtn" (OuterVolumeSpecName: "kube-api-access-qvgtn") pod "1695ca72-1c4c-496a-85cf-481567869c56" (UID: "1695ca72-1c4c-496a-85cf-481567869c56"). InnerVolumeSpecName "kube-api-access-qvgtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.761820 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1695ca72-1c4c-496a-85cf-481567869c56" (UID: "1695ca72-1c4c-496a-85cf-481567869c56"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.781589 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1695ca72-1c4c-496a-85cf-481567869c56" (UID: "1695ca72-1c4c-496a-85cf-481567869c56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.805651 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-config-data" (OuterVolumeSpecName: "config-data") pod "1695ca72-1c4c-496a-85cf-481567869c56" (UID: "1695ca72-1c4c-496a-85cf-481567869c56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.858057 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvgtn\" (UniqueName: \"kubernetes.io/projected/1695ca72-1c4c-496a-85cf-481567869c56-kube-api-access-qvgtn\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.858085 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.858094 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:14 crc kubenswrapper[4778]: I1205 16:41:14.858102 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1695ca72-1c4c-496a-85cf-481567869c56-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.283837 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" event={"ID":"1695ca72-1c4c-496a-85cf-481567869c56","Type":"ContainerDied","Data":"0d964d18a992696df12080bc05dd996a61e19c9e8d851174ec7b9fb4d28d7c67"} Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.284443 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d964d18a992696df12080bc05dd996a61e19c9e8d851174ec7b9fb4d28d7c67" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.284415 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.549203 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:41:15 crc kubenswrapper[4778]: E1205 16:41:15.549577 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1695ca72-1c4c-496a-85cf-481567869c56" containerName="watcher-kuttl-db-sync" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.549595 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1695ca72-1c4c-496a-85cf-481567869c56" containerName="watcher-kuttl-db-sync" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.549787 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1695ca72-1c4c-496a-85cf-481567869c56" containerName="watcher-kuttl-db-sync" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.550317 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.553566 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.553961 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-svrfk" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.563667 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.571151 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.573386 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.574428 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.581223 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.671304 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.671349 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbvvb\" (UniqueName: \"kubernetes.io/projected/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-kube-api-access-qbvvb\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.671416 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.671458 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.671488 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.671510 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.671526 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc 
kubenswrapper[4778]: I1205 16:41:15.671541 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l566\" (UniqueName: \"kubernetes.io/projected/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-kube-api-access-8l566\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.671557 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.671580 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-logs\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.675022 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.676185 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.678333 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.698378 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.772590 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-logs\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.772655 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e08d7c0-3b89-4afa-804d-e6f87469a631-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.772699 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w56v\" (UniqueName: \"kubernetes.io/projected/2e08d7c0-3b89-4afa-804d-e6f87469a631-kube-api-access-6w56v\") pod \"watcher-kuttl-applier-0\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.772742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc 
kubenswrapper[4778]: I1205 16:41:15.772769 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbvvb\" (UniqueName: \"kubernetes.io/projected/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-kube-api-access-qbvvb\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.772791 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e08d7c0-3b89-4afa-804d-e6f87469a631-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.772836 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.772893 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.772920 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e08d7c0-3b89-4afa-804d-e6f87469a631-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.772951 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.772979 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.772999 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.773022 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l566\" (UniqueName: \"kubernetes.io/projected/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-kube-api-access-8l566\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 
16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.773042 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.774654 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.774938 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-logs\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.782535 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.783430 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.784040 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.784673 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.790077 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.803105 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.815466 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qbvvb\" (UniqueName: \"kubernetes.io/projected/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-kube-api-access-qbvvb\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.815847 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l566\" (UniqueName: \"kubernetes.io/projected/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-kube-api-access-8l566\") pod \"watcher-kuttl-api-0\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.874168 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e08d7c0-3b89-4afa-804d-e6f87469a631-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.874269 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e08d7c0-3b89-4afa-804d-e6f87469a631-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.874316 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e08d7c0-3b89-4afa-804d-e6f87469a631-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.874351 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w56v\" (UniqueName: \"kubernetes.io/projected/2e08d7c0-3b89-4afa-804d-e6f87469a631-kube-api-access-6w56v\") pod \"watcher-kuttl-applier-0\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.874963 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e08d7c0-3b89-4afa-804d-e6f87469a631-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.882938 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e08d7c0-3b89-4afa-804d-e6f87469a631-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.900037 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e08d7c0-3b89-4afa-804d-e6f87469a631-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.900873 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w56v\" 
(UniqueName: \"kubernetes.io/projected/2e08d7c0-3b89-4afa-804d-e6f87469a631-kube-api-access-6w56v\") pod \"watcher-kuttl-applier-0\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.936283 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.944766 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:15 crc kubenswrapper[4778]: I1205 16:41:15.992358 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:16 crc kubenswrapper[4778]: I1205 16:41:16.479298 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:41:16 crc kubenswrapper[4778]: I1205 16:41:16.576937 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:41:16 crc kubenswrapper[4778]: W1205 16:41:16.581910 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e08d7c0_3b89_4afa_804d_e6f87469a631.slice/crio-b3519dc79fa6095c03c7ffca5d19ef8aa2492e0eda79e1fe4af3331f07f52063 WatchSource:0}: Error finding container b3519dc79fa6095c03c7ffca5d19ef8aa2492e0eda79e1fe4af3331f07f52063: Status 404 returned error can't find the container with id b3519dc79fa6095c03c7ffca5d19ef8aa2492e0eda79e1fe4af3331f07f52063 Dec 05 16:41:16 crc kubenswrapper[4778]: I1205 16:41:16.583119 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:41:17 crc kubenswrapper[4778]: I1205 16:41:17.310601 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"2e08d7c0-3b89-4afa-804d-e6f87469a631","Type":"ContainerStarted","Data":"ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d"} Dec 05 16:41:17 crc kubenswrapper[4778]: I1205 16:41:17.310896 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"2e08d7c0-3b89-4afa-804d-e6f87469a631","Type":"ContainerStarted","Data":"b3519dc79fa6095c03c7ffca5d19ef8aa2492e0eda79e1fe4af3331f07f52063"} Dec 05 16:41:17 crc kubenswrapper[4778]: I1205 16:41:17.312454 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerStarted","Data":"aa19eb31daa2f88c09cfe33cbca6449ef78c8caa487fb711ebf4ef750f876258"} Dec 05 16:41:17 crc kubenswrapper[4778]: I1205 16:41:17.312489 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerStarted","Data":"82026988f9bcf75fb3e277bc272bbc0deae5d91fc29f4ac64fb039174e39ee44"} Dec 05 16:41:17 crc kubenswrapper[4778]: I1205 16:41:17.316576 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5de6b2a4-6604-409e-8bd2-c74eb85ae51e","Type":"ContainerStarted","Data":"486e83d28cd72050c59cf79c3b4db6ba28ce504ec0aa869f3857a24f96a5ab66"} Dec 05 16:41:17 crc kubenswrapper[4778]: I1205 16:41:17.316613 
4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5de6b2a4-6604-409e-8bd2-c74eb85ae51e","Type":"ContainerStarted","Data":"dcb573b79dfd62158fb8b8022f0559878e8c84bc7636832b984ad3484b8db717"} Dec 05 16:41:17 crc kubenswrapper[4778]: I1205 16:41:17.316626 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5de6b2a4-6604-409e-8bd2-c74eb85ae51e","Type":"ContainerStarted","Data":"e86262e87ea9a6655ab9cf114b1733f91641e40e2cdfa2e57030006c0b78fc5c"} Dec 05 16:41:17 crc kubenswrapper[4778]: I1205 16:41:17.317080 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:17 crc kubenswrapper[4778]: I1205 16:41:17.364274 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.3642508429999998 podStartE2EDuration="2.364250843s" podCreationTimestamp="2025-12-05 16:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:41:17.328763795 +0000 UTC m=+2764.432560195" watchObservedRunningTime="2025-12-05 16:41:17.364250843 +0000 UTC m=+2764.468047233" Dec 05 16:41:17 crc kubenswrapper[4778]: I1205 16:41:17.371645 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.371628385 podStartE2EDuration="2.371628385s" podCreationTimestamp="2025-12-05 16:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:41:17.350556889 +0000 UTC m=+2764.454353269" watchObservedRunningTime="2025-12-05 16:41:17.371628385 +0000 UTC m=+2764.475424765" Dec 05 16:41:17 crc kubenswrapper[4778]: I1205 16:41:17.387175 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.387154418 podStartE2EDuration="2.387154418s" podCreationTimestamp="2025-12-05 16:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:41:17.377753482 +0000 UTC m=+2764.481549862" watchObservedRunningTime="2025-12-05 16:41:17.387154418 +0000 UTC m=+2764.490950798" Dec 05 16:41:17 crc kubenswrapper[4778]: I1205 16:41:17.776345 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/0.log" Dec 05 16:41:18 crc kubenswrapper[4778]: I1205 16:41:18.971759 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/0.log" Dec 05 16:41:19 crc kubenswrapper[4778]: I1205 16:41:19.548904 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:20 crc kubenswrapper[4778]: I1205 16:41:20.173576 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/0.log" Dec 05 16:41:20 crc kubenswrapper[4778]: I1205 16:41:20.354533 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerID="aa19eb31daa2f88c09cfe33cbca6449ef78c8caa487fb711ebf4ef750f876258" exitCode=1 Dec 05 16:41:20 crc kubenswrapper[4778]: I1205 16:41:20.354668 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerDied","Data":"aa19eb31daa2f88c09cfe33cbca6449ef78c8caa487fb711ebf4ef750f876258"} Dec 05 16:41:20 crc kubenswrapper[4778]: I1205 16:41:20.374097 4778 scope.go:117] "RemoveContainer" containerID="aa19eb31daa2f88c09cfe33cbca6449ef78c8caa487fb711ebf4ef750f876258" Dec 05 16:41:20 crc kubenswrapper[4778]: I1205 16:41:20.945527 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:20 crc kubenswrapper[4778]: I1205 16:41:20.992885 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:21 crc kubenswrapper[4778]: I1205 16:41:21.388948 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerStarted","Data":"ec19d5da53abfd0ce5251d5ff64dee0039eab8ca5dffb09425d40772e621f828"} Dec 05 16:41:21 crc kubenswrapper[4778]: I1205 16:41:21.398860 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:22 crc kubenswrapper[4778]: I1205 16:41:22.614763 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:23 crc kubenswrapper[4778]: I1205 16:41:23.408912 4778 generic.go:334] "Generic (PLEG): container finished" podID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerID="ec19d5da53abfd0ce5251d5ff64dee0039eab8ca5dffb09425d40772e621f828" exitCode=1 Dec 05 16:41:23 crc kubenswrapper[4778]: I1205 16:41:23.408958 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerDied","Data":"ec19d5da53abfd0ce5251d5ff64dee0039eab8ca5dffb09425d40772e621f828"} Dec 05 16:41:23 crc kubenswrapper[4778]: I1205 16:41:23.408994 4778 scope.go:117] "RemoveContainer" containerID="aa19eb31daa2f88c09cfe33cbca6449ef78c8caa487fb711ebf4ef750f876258" Dec 05 16:41:23 crc kubenswrapper[4778]: I1205 16:41:23.409582 4778 scope.go:117] "RemoveContainer" containerID="ec19d5da53abfd0ce5251d5ff64dee0039eab8ca5dffb09425d40772e621f828" Dec 05 16:41:23 crc kubenswrapper[4778]: E1205 16:41:23.409782 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:41:23 crc kubenswrapper[4778]: I1205 16:41:23.811088 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:24 crc kubenswrapper[4778]: I1205 
16:41:24.987790 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.120743 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q7mpj"] Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.122827 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.135080 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q7mpj"] Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.226520 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-catalog-content\") pod \"community-operators-q7mpj\" (UID: \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\") " pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.226856 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2d62\" (UniqueName: \"kubernetes.io/projected/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-kube-api-access-p2d62\") pod \"community-operators-q7mpj\" (UID: \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\") " pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.226909 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-utilities\") pod \"community-operators-q7mpj\" (UID: \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\") " pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.328110 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-catalog-content\") pod \"community-operators-q7mpj\" (UID: \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\") " pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.328184 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2d62\" (UniqueName: \"kubernetes.io/projected/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-kube-api-access-p2d62\") pod \"community-operators-q7mpj\" (UID: \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\") " pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.328238 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-utilities\") pod \"community-operators-q7mpj\" (UID: \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\") " pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.328750 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-catalog-content\") pod \"community-operators-q7mpj\" (UID: \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\") " 
pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.328780 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-utilities\") pod \"community-operators-q7mpj\" (UID: \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\") " pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.360396 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2d62\" (UniqueName: \"kubernetes.io/projected/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-kube-api-access-p2d62\") pod \"community-operators-q7mpj\" (UID: \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\") " pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.450582 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.938191 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.938409 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.938989 4778 scope.go:117] "RemoveContainer" containerID="ec19d5da53abfd0ce5251d5ff64dee0039eab8ca5dffb09425d40772e621f828" Dec 05 16:41:25 crc kubenswrapper[4778]: E1205 16:41:25.939418 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.942127 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q7mpj"] Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.947499 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.955883 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:25 crc kubenswrapper[4778]: I1205 16:41:25.992997 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:26 crc kubenswrapper[4778]: I1205 16:41:26.019114 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:26 crc kubenswrapper[4778]: I1205 16:41:26.157873 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:26 crc kubenswrapper[4778]: I1205 16:41:26.434688 4778 generic.go:334] "Generic (PLEG): container finished" podID="3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" containerID="f644446174e2c43a5f641c7a5a8a16665f3e6febbb820bfb3e320b42df9b0fd8" exitCode=0 Dec 05 16:41:26 crc kubenswrapper[4778]: 
I1205 16:41:26.434740 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7mpj" event={"ID":"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5","Type":"ContainerDied","Data":"f644446174e2c43a5f641c7a5a8a16665f3e6febbb820bfb3e320b42df9b0fd8"} Dec 05 16:41:26 crc kubenswrapper[4778]: I1205 16:41:26.435030 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7mpj" event={"ID":"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5","Type":"ContainerStarted","Data":"4d9cbbccafb05e53cd9ee092d3fc3ac4bc9c026b00359803814b891e884c9e6d"} Dec 05 16:41:26 crc kubenswrapper[4778]: I1205 16:41:26.437157 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 16:41:26 crc kubenswrapper[4778]: I1205 16:41:26.440313 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:41:26 crc kubenswrapper[4778]: I1205 16:41:26.464798 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:41:27 crc kubenswrapper[4778]: I1205 16:41:27.367764 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:27 crc kubenswrapper[4778]: I1205 16:41:27.444104 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7mpj" event={"ID":"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5","Type":"ContainerStarted","Data":"7faf5555408e64ff13a946226c4e862dbc55478f854801711f3fda01624137a0"} Dec 05 16:41:28 crc kubenswrapper[4778]: I1205 16:41:28.454579 4778 generic.go:334] "Generic (PLEG): container finished" podID="3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" containerID="7faf5555408e64ff13a946226c4e862dbc55478f854801711f3fda01624137a0" exitCode=0 Dec 05 16:41:28 crc kubenswrapper[4778]: I1205 16:41:28.454688 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7mpj" event={"ID":"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5","Type":"ContainerDied","Data":"7faf5555408e64ff13a946226c4e862dbc55478f854801711f3fda01624137a0"} Dec 05 16:41:28 crc kubenswrapper[4778]: I1205 16:41:28.537069 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:29 crc kubenswrapper[4778]: I1205 16:41:29.465592 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7mpj" event={"ID":"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5","Type":"ContainerStarted","Data":"ac0bf749fbd8b770af1e1e1e45ab8229571b1585f79bea69bde0bef4fea7aed8"} Dec 05 16:41:29 crc kubenswrapper[4778]: I1205 16:41:29.717970 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:30 crc kubenswrapper[4778]: I1205 16:41:30.900047 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:32 crc kubenswrapper[4778]: I1205 16:41:32.118481 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:33 crc kubenswrapper[4778]: I1205 16:41:33.300866 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:34 crc kubenswrapper[4778]: I1205 16:41:34.499344 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:35 crc kubenswrapper[4778]: I1205 16:41:35.451742 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:35 crc kubenswrapper[4778]: I1205 16:41:35.452098 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:35 crc kubenswrapper[4778]: I1205 16:41:35.525798 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:35 crc kubenswrapper[4778]: I1205 16:41:35.553707 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q7mpj" podStartSLOduration=8.161412956 podStartE2EDuration="10.553691201s" podCreationTimestamp="2025-12-05 16:41:25 +0000 UTC" firstStartedPulling="2025-12-05 16:41:26.436950997 +0000 UTC m=+2773.540747377" lastFinishedPulling="2025-12-05 16:41:28.829229242 +0000 UTC m=+2775.933025622" observedRunningTime="2025-12-05 16:41:29.486110501 +0000 UTC m=+2776.589906901" watchObservedRunningTime="2025-12-05 16:41:35.553691201 +0000 UTC m=+2782.657487581" Dec 05 16:41:35 crc kubenswrapper[4778]: I1205 16:41:35.626264 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:35 crc kubenswrapper[4778]: I1205 16:41:35.649776 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:36 crc kubenswrapper[4778]: I1205 16:41:36.825046 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:37 crc kubenswrapper[4778]: I1205 16:41:37.249587 4778 scope.go:117] "RemoveContainer" containerID="ec19d5da53abfd0ce5251d5ff64dee0039eab8ca5dffb09425d40772e621f828" Dec 05 16:41:38 crc kubenswrapper[4778]: I1205 16:41:38.004565 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/1.log" Dec 05 16:41:38 crc kubenswrapper[4778]: I1205 16:41:38.540401 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerStarted","Data":"53996c4f39cf757fd0930b71f58df0b85158fc5ad15786f96fd3688e1d264ea0"} Dec 05 16:41:39 crc kubenswrapper[4778]: I1205 16:41:39.116525 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q7mpj"] Dec 05 16:41:39 crc kubenswrapper[4778]: I1205 16:41:39.117023 4778 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q7mpj" podUID="3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" containerName="registry-server" containerID="cri-o://ac0bf749fbd8b770af1e1e1e45ab8229571b1585f79bea69bde0bef4fea7aed8" gracePeriod=2 Dec 05 16:41:39 crc kubenswrapper[4778]: I1205 16:41:39.170958 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:39 crc kubenswrapper[4778]: I1205 16:41:39.552641 4778 generic.go:334] "Generic (PLEG): container finished" podID="3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" containerID="ac0bf749fbd8b770af1e1e1e45ab8229571b1585f79bea69bde0bef4fea7aed8" exitCode=0 Dec 05 16:41:39 crc kubenswrapper[4778]: I1205 16:41:39.552687 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7mpj" event={"ID":"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5","Type":"ContainerDied","Data":"ac0bf749fbd8b770af1e1e1e45ab8229571b1585f79bea69bde0bef4fea7aed8"} Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.197781 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.347193 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.376670 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-catalog-content\") pod \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\" (UID: \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\") " Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.376779 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2d62\" (UniqueName: \"kubernetes.io/projected/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-kube-api-access-p2d62\") pod \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\" (UID: \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\") " Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.376862 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-utilities\") pod \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\" (UID: \"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5\") " Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.381241 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-utilities" (OuterVolumeSpecName: "utilities") pod "3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" (UID: "3eb86da0-3ae3-403e-808e-2dd41b1fe4c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.392582 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-kube-api-access-p2d62" (OuterVolumeSpecName: "kube-api-access-p2d62") pod "3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" (UID: "3eb86da0-3ae3-403e-808e-2dd41b1fe4c5"). InnerVolumeSpecName "kube-api-access-p2d62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.435786 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" (UID: "3eb86da0-3ae3-403e-808e-2dd41b1fe4c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.478287 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2d62\" (UniqueName: \"kubernetes.io/projected/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-kube-api-access-p2d62\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.478315 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.478325 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.576862 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7mpj" event={"ID":"3eb86da0-3ae3-403e-808e-2dd41b1fe4c5","Type":"ContainerDied","Data":"4d9cbbccafb05e53cd9ee092d3fc3ac4bc9c026b00359803814b891e884c9e6d"} Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.576929 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q7mpj" Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.576946 4778 scope.go:117] "RemoveContainer" containerID="ac0bf749fbd8b770af1e1e1e45ab8229571b1585f79bea69bde0bef4fea7aed8" Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.602807 4778 scope.go:117] "RemoveContainer" containerID="7faf5555408e64ff13a946226c4e862dbc55478f854801711f3fda01624137a0" Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.678558 4778 scope.go:117] "RemoveContainer" containerID="f644446174e2c43a5f641c7a5a8a16665f3e6febbb820bfb3e320b42df9b0fd8" Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.678997 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q7mpj"] Dec 05 16:41:40 crc kubenswrapper[4778]: I1205 16:41:40.685499 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q7mpj"] Dec 05 16:41:41 crc kubenswrapper[4778]: I1205 16:41:41.259248 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" path="/var/lib/kubelet/pods/3eb86da0-3ae3-403e-808e-2dd41b1fe4c5/volumes" Dec 05 16:41:41 crc kubenswrapper[4778]: I1205 16:41:41.537906 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:41 crc kubenswrapper[4778]: I1205 16:41:41.589569 4778 generic.go:334] "Generic (PLEG): container finished" podID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerID="53996c4f39cf757fd0930b71f58df0b85158fc5ad15786f96fd3688e1d264ea0" exitCode=1 Dec 05 16:41:41 crc kubenswrapper[4778]: I1205 16:41:41.589608 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerDied","Data":"53996c4f39cf757fd0930b71f58df0b85158fc5ad15786f96fd3688e1d264ea0"} Dec 05 16:41:41 crc kubenswrapper[4778]: I1205 16:41:41.589637 4778 scope.go:117] "RemoveContainer" containerID="ec19d5da53abfd0ce5251d5ff64dee0039eab8ca5dffb09425d40772e621f828" Dec 05 16:41:41 crc kubenswrapper[4778]: I1205 16:41:41.590381 4778 scope.go:117] "RemoveContainer" containerID="53996c4f39cf757fd0930b71f58df0b85158fc5ad15786f96fd3688e1d264ea0" Dec 05 16:41:41 crc kubenswrapper[4778]: E1205 16:41:41.590711 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:41:42 crc kubenswrapper[4778]: I1205 16:41:42.747324 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:43 crc kubenswrapper[4778]: I1205 16:41:43.922311 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:45 crc kubenswrapper[4778]: I1205 16:41:45.127753 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:45 crc kubenswrapper[4778]: I1205 16:41:45.936874 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:45 crc kubenswrapper[4778]: I1205 16:41:45.936931 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:45 crc kubenswrapper[4778]: I1205 16:41:45.936948 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:45 crc kubenswrapper[4778]: I1205 16:41:45.936960 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:41:45 crc kubenswrapper[4778]: I1205 16:41:45.937595 4778 scope.go:117] "RemoveContainer" containerID="53996c4f39cf757fd0930b71f58df0b85158fc5ad15786f96fd3688e1d264ea0" Dec 05 16:41:45 crc kubenswrapper[4778]: E1205 16:41:45.937924 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:41:46 crc kubenswrapper[4778]: I1205 16:41:46.309856 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:47 crc 
kubenswrapper[4778]: I1205 16:41:47.558549 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:48 crc kubenswrapper[4778]: I1205 16:41:48.807218 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:50 crc kubenswrapper[4778]: I1205 16:41:50.074304 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:51 crc kubenswrapper[4778]: I1205 16:41:51.310518 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:52 crc kubenswrapper[4778]: I1205 16:41:52.483831 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:53 crc kubenswrapper[4778]: I1205 16:41:53.670249 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:54 crc kubenswrapper[4778]: I1205 16:41:54.900343 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:56 crc kubenswrapper[4778]: I1205 16:41:56.125933 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:57 crc kubenswrapper[4778]: I1205 16:41:57.326188 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:58 crc kubenswrapper[4778]: I1205 16:41:58.570096 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:41:59 crc kubenswrapper[4778]: I1205 16:41:59.747040 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:42:00 crc kubenswrapper[4778]: I1205 16:42:00.249080 4778 scope.go:117] "RemoveContainer" containerID="53996c4f39cf757fd0930b71f58df0b85158fc5ad15786f96fd3688e1d264ea0" Dec 05 16:42:00 crc kubenswrapper[4778]: E1205 16:42:00.249459 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:42:00 crc kubenswrapper[4778]: I1205 16:42:00.995226 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:42:02 crc kubenswrapper[4778]: I1205 16:42:02.248120 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:42:03 crc kubenswrapper[4778]: I1205 16:42:03.450545 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:42:04 crc kubenswrapper[4778]: I1205 16:42:04.697672 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:42:05 crc kubenswrapper[4778]: I1205 16:42:05.938386 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:42:07 crc kubenswrapper[4778]: I1205 16:42:07.134335 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:42:08 crc kubenswrapper[4778]: I1205 16:42:08.304541 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:42:09 crc kubenswrapper[4778]: I1205 16:42:09.485527 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:42:10 crc kubenswrapper[4778]: I1205 16:42:10.728079 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:42:11 crc kubenswrapper[4778]: I1205 16:42:11.950894 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/2.log" Dec 05 16:42:12 crc kubenswrapper[4778]: I1205 16:42:12.249754 4778 scope.go:117] "RemoveContainer" containerID="53996c4f39cf757fd0930b71f58df0b85158fc5ad15786f96fd3688e1d264ea0" Dec 05 16:42:12 crc kubenswrapper[4778]: I1205 16:42:12.857074 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerStarted","Data":"84e95e045af07a6709e7a5f0740c5d3522a9eeeb219defdd7fb9beb50e643a26"} Dec 05 16:42:13 crc kubenswrapper[4778]: I1205 16:42:13.112341 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:14 crc kubenswrapper[4778]: I1205 16:42:14.263414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:15 crc kubenswrapper[4778]: I1205 16:42:15.455672 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:15 crc kubenswrapper[4778]: I1205 16:42:15.881981 4778 generic.go:334] "Generic (PLEG): container finished" podID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerID="84e95e045af07a6709e7a5f0740c5d3522a9eeeb219defdd7fb9beb50e643a26" exitCode=1 Dec 05 16:42:15 crc kubenswrapper[4778]: I1205 16:42:15.882014 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerDied","Data":"84e95e045af07a6709e7a5f0740c5d3522a9eeeb219defdd7fb9beb50e643a26"} Dec 05 16:42:15 crc kubenswrapper[4778]: I1205 16:42:15.882059 4778 scope.go:117] "RemoveContainer" containerID="53996c4f39cf757fd0930b71f58df0b85158fc5ad15786f96fd3688e1d264ea0" Dec 05 16:42:15 crc kubenswrapper[4778]: I1205 16:42:15.882725 4778 scope.go:117] "RemoveContainer" containerID="84e95e045af07a6709e7a5f0740c5d3522a9eeeb219defdd7fb9beb50e643a26" Dec 05 16:42:15 crc kubenswrapper[4778]: E1205 16:42:15.882981 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:42:15 crc kubenswrapper[4778]: I1205 16:42:15.936698 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:42:15 crc kubenswrapper[4778]: I1205 16:42:15.936787 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:42:15 crc kubenswrapper[4778]: I1205 16:42:15.936804 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:42:15 crc kubenswrapper[4778]: I1205 16:42:15.936821 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:42:16 crc kubenswrapper[4778]: I1205 16:42:16.642396 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:16 crc kubenswrapper[4778]: I1205 16:42:16.891467 4778 scope.go:117] "RemoveContainer" containerID="84e95e045af07a6709e7a5f0740c5d3522a9eeeb219defdd7fb9beb50e643a26" Dec 05 16:42:16 crc kubenswrapper[4778]: E1205 16:42:16.891745 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:42:17 crc kubenswrapper[4778]: I1205 16:42:17.889255 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:19 crc kubenswrapper[4778]: 
I1205 16:42:19.155142 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:20 crc kubenswrapper[4778]: I1205 16:42:20.328886 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:21 crc kubenswrapper[4778]: I1205 16:42:21.581068 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:22 crc kubenswrapper[4778]: I1205 16:42:22.772983 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:24 crc kubenswrapper[4778]: I1205 16:42:24.003742 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:25 crc kubenswrapper[4778]: I1205 16:42:25.211939 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:26 crc kubenswrapper[4778]: I1205 16:42:26.398944 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:27 crc kubenswrapper[4778]: I1205 16:42:27.250461 4778 scope.go:117] "RemoveContainer" containerID="84e95e045af07a6709e7a5f0740c5d3522a9eeeb219defdd7fb9beb50e643a26" Dec 05 16:42:27 crc kubenswrapper[4778]: E1205 16:42:27.250701 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:42:27 crc kubenswrapper[4778]: I1205 16:42:27.630478 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:28 crc kubenswrapper[4778]: I1205 16:42:28.839864 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:30 crc kubenswrapper[4778]: I1205 16:42:30.027775 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:31 crc kubenswrapper[4778]: I1205 16:42:31.254851 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:32 crc kubenswrapper[4778]: I1205 16:42:32.455488 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:33 crc kubenswrapper[4778]: I1205 16:42:33.415076 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:42:33 crc kubenswrapper[4778]: I1205 16:42:33.415156 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:42:33 crc kubenswrapper[4778]: I1205 16:42:33.663753 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:34 crc kubenswrapper[4778]: I1205 16:42:34.866414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:36 crc kubenswrapper[4778]: I1205 16:42:36.095492 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:37 crc kubenswrapper[4778]: I1205 16:42:37.280584 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:38 crc kubenswrapper[4778]: I1205 16:42:38.482092 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:39 crc kubenswrapper[4778]: I1205 16:42:39.249701 4778 scope.go:117] "RemoveContainer" containerID="84e95e045af07a6709e7a5f0740c5d3522a9eeeb219defdd7fb9beb50e643a26" Dec 05 16:42:39 crc kubenswrapper[4778]: E1205 16:42:39.249979 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:42:39 crc kubenswrapper[4778]: I1205 16:42:39.717678 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:40 crc kubenswrapper[4778]: I1205 16:42:40.937391 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:42 crc kubenswrapper[4778]: I1205 16:42:42.107163 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 
16:42:43 crc kubenswrapper[4778]: I1205 16:42:43.292267 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:44 crc kubenswrapper[4778]: I1205 16:42:44.470936 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:45 crc kubenswrapper[4778]: I1205 16:42:45.644582 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:46 crc kubenswrapper[4778]: I1205 16:42:46.836337 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:48 crc kubenswrapper[4778]: I1205 16:42:48.041516 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:48 crc kubenswrapper[4778]: I1205 16:42:48.749986 4778 scope.go:117] "RemoveContainer" containerID="af79d8e78be651822ce5095878935fd8c0685a54d25823ee61b559b1832ef62b" Dec 05 16:42:48 crc kubenswrapper[4778]: I1205 16:42:48.788183 4778 scope.go:117] "RemoveContainer" containerID="af4d05c66fc16e524d095921d065b0c4c760e80b9b9e35e42987db9642e6dc63" Dec 05 16:42:48 crc kubenswrapper[4778]: I1205 16:42:48.822584 4778 scope.go:117] "RemoveContainer" containerID="c946218ff5881070cb0218aceb13162a4c809c65d77f11a88834c149b5e58d12" Dec 05 16:42:48 crc kubenswrapper[4778]: I1205 16:42:48.851121 4778 scope.go:117] "RemoveContainer" containerID="8c02c7fb994d4e7c1a3cbe51a221bb427053b507384aa8cc52722451bdf97b18" Dec 05 16:42:48 crc kubenswrapper[4778]: I1205 16:42:48.871515 4778 scope.go:117] "RemoveContainer" containerID="750a95dfb36f3ecd21cb0dfc57217e9d88cad3ab11c7ad900fabde8499f47b2b" Dec 05 16:42:49 crc kubenswrapper[4778]: I1205 16:42:49.226281 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:50 crc kubenswrapper[4778]: I1205 16:42:50.250059 4778 scope.go:117] "RemoveContainer" containerID="84e95e045af07a6709e7a5f0740c5d3522a9eeeb219defdd7fb9beb50e643a26" Dec 05 16:42:50 crc kubenswrapper[4778]: E1205 16:42:50.250323 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:42:50 crc kubenswrapper[4778]: I1205 16:42:50.437175 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:51 crc kubenswrapper[4778]: I1205 16:42:51.645993 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:52 
crc kubenswrapper[4778]: I1205 16:42:52.832341 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:54 crc kubenswrapper[4778]: I1205 16:42:54.042817 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:55 crc kubenswrapper[4778]: I1205 16:42:55.251841 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:56 crc kubenswrapper[4778]: I1205 16:42:56.453695 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:57 crc kubenswrapper[4778]: I1205 16:42:57.653814 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:42:58 crc kubenswrapper[4778]: I1205 16:42:58.843633 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:43:00 crc kubenswrapper[4778]: I1205 16:43:00.037689 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:43:01 crc kubenswrapper[4778]: I1205 16:43:01.230765 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:43:02 crc kubenswrapper[4778]: I1205 16:43:02.439989 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:43:03 crc kubenswrapper[4778]: I1205 16:43:03.414467 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:43:03 crc kubenswrapper[4778]: I1205 16:43:03.414844 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:43:03 crc kubenswrapper[4778]: I1205 16:43:03.626903 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:43:04 crc kubenswrapper[4778]: I1205 16:43:04.864002 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:43:05 crc kubenswrapper[4778]: I1205 16:43:05.249915 4778 scope.go:117] 
"RemoveContainer" containerID="84e95e045af07a6709e7a5f0740c5d3522a9eeeb219defdd7fb9beb50e643a26" Dec 05 16:43:06 crc kubenswrapper[4778]: I1205 16:43:06.047539 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/3.log" Dec 05 16:43:06 crc kubenswrapper[4778]: I1205 16:43:06.266576 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerStarted","Data":"c75888caf7514aa3911d64f34af5a0ba334ab977e1a9963f59b939e955861f21"} Dec 05 16:43:07 crc kubenswrapper[4778]: I1205 16:43:07.234515 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:08 crc kubenswrapper[4778]: I1205 16:43:08.283539 4778 generic.go:334] "Generic (PLEG): container finished" podID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerID="c75888caf7514aa3911d64f34af5a0ba334ab977e1a9963f59b939e955861f21" exitCode=1 Dec 05 16:43:08 crc kubenswrapper[4778]: I1205 16:43:08.283619 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerDied","Data":"c75888caf7514aa3911d64f34af5a0ba334ab977e1a9963f59b939e955861f21"} Dec 05 16:43:08 crc kubenswrapper[4778]: I1205 16:43:08.285165 4778 scope.go:117] "RemoveContainer" containerID="84e95e045af07a6709e7a5f0740c5d3522a9eeeb219defdd7fb9beb50e643a26" Dec 05 16:43:08 crc kubenswrapper[4778]: I1205 16:43:08.285823 4778 scope.go:117] "RemoveContainer" containerID="c75888caf7514aa3911d64f34af5a0ba334ab977e1a9963f59b939e955861f21" Dec 05 16:43:08 crc kubenswrapper[4778]: E1205 16:43:08.286113 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:43:08 crc kubenswrapper[4778]: I1205 16:43:08.442317 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:09 crc kubenswrapper[4778]: I1205 16:43:09.639260 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:10 crc kubenswrapper[4778]: I1205 16:43:10.805398 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:12 crc kubenswrapper[4778]: I1205 16:43:12.004038 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:13 crc kubenswrapper[4778]: I1205 16:43:13.216261 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:14 crc kubenswrapper[4778]: I1205 16:43:14.406706 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:15 crc kubenswrapper[4778]: I1205 16:43:15.592716 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:15 crc kubenswrapper[4778]: I1205 16:43:15.937417 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:43:15 crc kubenswrapper[4778]: I1205 16:43:15.937502 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:43:15 crc kubenswrapper[4778]: I1205 16:43:15.937521 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:43:15 crc kubenswrapper[4778]: I1205 16:43:15.937537 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:43:15 crc kubenswrapper[4778]: I1205 16:43:15.938448 4778 scope.go:117] "RemoveContainer" containerID="c75888caf7514aa3911d64f34af5a0ba334ab977e1a9963f59b939e955861f21" Dec 05 16:43:15 crc kubenswrapper[4778]: E1205 16:43:15.938773 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:43:16 crc kubenswrapper[4778]: I1205 16:43:16.793881 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:18 crc kubenswrapper[4778]: I1205 16:43:18.047549 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:19 crc kubenswrapper[4778]: I1205 16:43:19.280573 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:20 crc kubenswrapper[4778]: I1205 16:43:20.488385 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:21 crc kubenswrapper[4778]: I1205 16:43:21.738680 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:22 crc kubenswrapper[4778]: I1205 16:43:22.932958 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:24 crc kubenswrapper[4778]: I1205 16:43:24.116489 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:25 crc kubenswrapper[4778]: I1205 16:43:25.352901 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:26 crc kubenswrapper[4778]: I1205 16:43:26.605382 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:27 crc kubenswrapper[4778]: I1205 16:43:27.249775 4778 scope.go:117] "RemoveContainer" containerID="c75888caf7514aa3911d64f34af5a0ba334ab977e1a9963f59b939e955861f21" Dec 05 16:43:27 crc kubenswrapper[4778]: E1205 16:43:27.250460 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:43:27 crc kubenswrapper[4778]: I1205 16:43:27.822867 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:29 crc kubenswrapper[4778]: I1205 16:43:29.040563 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:30 crc kubenswrapper[4778]: I1205 16:43:30.221046 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:31 crc kubenswrapper[4778]: I1205 16:43:31.453033 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:32 crc kubenswrapper[4778]: I1205 16:43:32.708862 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:33 crc kubenswrapper[4778]: I1205 16:43:33.414525 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:43:33 crc kubenswrapper[4778]: I1205 16:43:33.414600 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 
16:43:33 crc kubenswrapper[4778]: I1205 16:43:33.414656 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 16:43:33 crc kubenswrapper[4778]: I1205 16:43:33.415461 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:43:33 crc kubenswrapper[4778]: I1205 16:43:33.415531 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" gracePeriod=600 Dec 05 16:43:33 crc kubenswrapper[4778]: E1205 16:43:33.559089 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:43:33 crc kubenswrapper[4778]: I1205 16:43:33.921210 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:34 crc kubenswrapper[4778]: I1205 16:43:34.534463 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" exitCode=0 Dec 05 16:43:34 crc kubenswrapper[4778]: I1205 16:43:34.534567 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5"} Dec 05 16:43:34 crc kubenswrapper[4778]: I1205 16:43:34.534825 4778 scope.go:117] "RemoveContainer" containerID="0252113aba7ac5b30976ddfa801607e96d4386b08fb7868f7c833a29836c7593" Dec 05 16:43:34 crc kubenswrapper[4778]: I1205 16:43:34.535848 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:43:34 crc kubenswrapper[4778]: E1205 16:43:34.536251 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:43:35 crc kubenswrapper[4778]: I1205 16:43:35.140551 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:36 crc kubenswrapper[4778]: I1205 16:43:36.287485 4778 log.go:25] "Finished 
parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:37 crc kubenswrapper[4778]: I1205 16:43:37.470667 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:38 crc kubenswrapper[4778]: I1205 16:43:38.636405 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:39 crc kubenswrapper[4778]: I1205 16:43:39.864395 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:41 crc kubenswrapper[4778]: I1205 16:43:41.051166 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:42 crc kubenswrapper[4778]: I1205 16:43:42.234514 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:42 crc kubenswrapper[4778]: I1205 16:43:42.249655 4778 scope.go:117] "RemoveContainer" containerID="c75888caf7514aa3911d64f34af5a0ba334ab977e1a9963f59b939e955861f21" Dec 05 16:43:42 crc kubenswrapper[4778]: E1205 16:43:42.249854 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:43:43 crc kubenswrapper[4778]: I1205 16:43:43.411642 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:44 crc kubenswrapper[4778]: I1205 16:43:44.624706 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:46 crc kubenswrapper[4778]: I1205 16:43:46.121002 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:47 crc kubenswrapper[4778]: I1205 16:43:47.378882 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:48 crc kubenswrapper[4778]: I1205 16:43:48.602689 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:49 crc kubenswrapper[4778]: I1205 16:43:49.248995 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:43:49 crc kubenswrapper[4778]: E1205 16:43:49.249240 
4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:43:49 crc kubenswrapper[4778]: I1205 16:43:49.786293 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:50 crc kubenswrapper[4778]: I1205 16:43:50.989319 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:52 crc kubenswrapper[4778]: I1205 16:43:52.189184 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:53 crc kubenswrapper[4778]: I1205 16:43:53.391608 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:54 crc kubenswrapper[4778]: I1205 16:43:54.590508 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:55 crc kubenswrapper[4778]: I1205 16:43:55.797087 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:56 crc kubenswrapper[4778]: I1205 16:43:56.249285 4778 scope.go:117] "RemoveContainer" containerID="c75888caf7514aa3911d64f34af5a0ba334ab977e1a9963f59b939e955861f21" Dec 05 16:43:56 crc kubenswrapper[4778]: E1205 16:43:56.249878 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:43:57 crc kubenswrapper[4778]: I1205 16:43:57.005001 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:58 crc kubenswrapper[4778]: I1205 16:43:58.203518 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:43:59 crc kubenswrapper[4778]: I1205 16:43:59.416186 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:00 crc kubenswrapper[4778]: I1205 16:44:00.597720 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:01 crc kubenswrapper[4778]: I1205 16:44:01.822277 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:03 crc kubenswrapper[4778]: I1205 16:44:03.061734 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:04 crc kubenswrapper[4778]: I1205 16:44:04.250312 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:44:04 crc kubenswrapper[4778]: E1205 16:44:04.250575 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:44:04 crc kubenswrapper[4778]: I1205 16:44:04.292256 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:05 crc kubenswrapper[4778]: I1205 16:44:05.516928 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:06 crc kubenswrapper[4778]: I1205 16:44:06.719810 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:07 crc kubenswrapper[4778]: I1205 16:44:07.902070 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:09 crc kubenswrapper[4778]: I1205 16:44:09.113916 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:10 crc kubenswrapper[4778]: I1205 16:44:10.310112 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:11 crc kubenswrapper[4778]: I1205 16:44:11.249453 4778 scope.go:117] "RemoveContainer" containerID="c75888caf7514aa3911d64f34af5a0ba334ab977e1a9963f59b939e955861f21" Dec 05 16:44:11 crc kubenswrapper[4778]: E1205 16:44:11.249751 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:44:11 crc kubenswrapper[4778]: I1205 16:44:11.524580 4778 log.go:25] 
"Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:12 crc kubenswrapper[4778]: I1205 16:44:12.699603 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:13 crc kubenswrapper[4778]: I1205 16:44:13.903351 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:15 crc kubenswrapper[4778]: I1205 16:44:15.106182 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:16 crc kubenswrapper[4778]: I1205 16:44:16.249574 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:44:16 crc kubenswrapper[4778]: E1205 16:44:16.249944 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:44:16 crc kubenswrapper[4778]: I1205 16:44:16.309968 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:17 crc kubenswrapper[4778]: I1205 16:44:17.489620 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:18 crc kubenswrapper[4778]: I1205 16:44:18.691620 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:19 crc kubenswrapper[4778]: I1205 16:44:19.875627 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:21 crc kubenswrapper[4778]: I1205 16:44:21.080430 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:22 crc kubenswrapper[4778]: I1205 16:44:22.250320 4778 scope.go:117] "RemoveContainer" containerID="c75888caf7514aa3911d64f34af5a0ba334ab977e1a9963f59b939e955861f21" Dec 05 16:44:22 crc kubenswrapper[4778]: E1205 16:44:22.250807 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:44:22 crc kubenswrapper[4778]: I1205 
16:44:22.320248 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:23 crc kubenswrapper[4778]: I1205 16:44:23.507679 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:24 crc kubenswrapper[4778]: I1205 16:44:24.673724 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:25 crc kubenswrapper[4778]: I1205 16:44:25.879701 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:27 crc kubenswrapper[4778]: I1205 16:44:27.051526 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:28 crc kubenswrapper[4778]: I1205 16:44:28.254547 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:29 crc kubenswrapper[4778]: I1205 16:44:29.456839 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:30 crc kubenswrapper[4778]: I1205 16:44:30.654475 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:31 crc kubenswrapper[4778]: I1205 16:44:31.249550 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:44:31 crc kubenswrapper[4778]: E1205 16:44:31.249748 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:44:31 crc kubenswrapper[4778]: I1205 16:44:31.904537 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:33 crc kubenswrapper[4778]: I1205 16:44:33.113641 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:34 crc kubenswrapper[4778]: I1205 16:44:34.317128 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:35 crc kubenswrapper[4778]: I1205 16:44:35.521664 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:36 crc kubenswrapper[4778]: I1205 16:44:36.249194 4778 scope.go:117] "RemoveContainer" containerID="c75888caf7514aa3911d64f34af5a0ba334ab977e1a9963f59b939e955861f21" Dec 05 16:44:36 crc kubenswrapper[4778]: I1205 16:44:36.788985 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/4.log" Dec 05 16:44:37 crc kubenswrapper[4778]: I1205 16:44:37.103299 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerStarted","Data":"af5ad27761791f2b4a7e5e67001d35d92f2f607f9240e9d8ee3b7c8bbabefa51"} Dec 05 16:44:37 crc kubenswrapper[4778]: I1205 16:44:37.964556 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:39 crc kubenswrapper[4778]: I1205 16:44:39.154040 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:40 crc kubenswrapper[4778]: I1205 16:44:40.151399 4778 generic.go:334] "Generic (PLEG): container finished" podID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerID="af5ad27761791f2b4a7e5e67001d35d92f2f607f9240e9d8ee3b7c8bbabefa51" exitCode=1 Dec 05 16:44:40 crc kubenswrapper[4778]: I1205 16:44:40.151415 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerDied","Data":"af5ad27761791f2b4a7e5e67001d35d92f2f607f9240e9d8ee3b7c8bbabefa51"} Dec 05 16:44:40 crc kubenswrapper[4778]: I1205 16:44:40.151841 4778 scope.go:117] "RemoveContainer" containerID="c75888caf7514aa3911d64f34af5a0ba334ab977e1a9963f59b939e955861f21" Dec 05 16:44:40 crc kubenswrapper[4778]: I1205 16:44:40.152473 4778 scope.go:117] "RemoveContainer" containerID="af5ad27761791f2b4a7e5e67001d35d92f2f607f9240e9d8ee3b7c8bbabefa51" Dec 05 16:44:40 crc kubenswrapper[4778]: E1205 16:44:40.152755 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:44:40 crc kubenswrapper[4778]: I1205 16:44:40.370835 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:41 crc kubenswrapper[4778]: I1205 16:44:41.541396 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:42 crc kubenswrapper[4778]: I1205 16:44:42.735387 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:43 crc kubenswrapper[4778]: I1205 16:44:43.913082 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:45 crc kubenswrapper[4778]: I1205 16:44:45.119373 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:45 crc kubenswrapper[4778]: I1205 16:44:45.250132 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:44:45 crc kubenswrapper[4778]: E1205 16:44:45.250493 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:44:45 crc kubenswrapper[4778]: I1205 16:44:45.937321 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:44:45 crc kubenswrapper[4778]: I1205 16:44:45.937393 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:44:45 crc kubenswrapper[4778]: I1205 16:44:45.937407 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:44:45 crc kubenswrapper[4778]: I1205 16:44:45.937419 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:44:45 crc kubenswrapper[4778]: I1205 16:44:45.938106 4778 scope.go:117] "RemoveContainer" containerID="af5ad27761791f2b4a7e5e67001d35d92f2f607f9240e9d8ee3b7c8bbabefa51" Dec 05 16:44:45 crc kubenswrapper[4778]: E1205 16:44:45.938477 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:44:46 crc kubenswrapper[4778]: I1205 16:44:46.328389 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:47 crc kubenswrapper[4778]: I1205 16:44:47.533541 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:48 crc kubenswrapper[4778]: I1205 16:44:48.708542 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:49 crc kubenswrapper[4778]: 
I1205 16:44:49.859313 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:51 crc kubenswrapper[4778]: I1205 16:44:51.043322 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:52 crc kubenswrapper[4778]: I1205 16:44:52.241816 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:53 crc kubenswrapper[4778]: I1205 16:44:53.457707 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:54 crc kubenswrapper[4778]: I1205 16:44:54.663638 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:55 crc kubenswrapper[4778]: I1205 16:44:55.871702 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:57 crc kubenswrapper[4778]: I1205 16:44:57.071246 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:57 crc kubenswrapper[4778]: I1205 16:44:57.249751 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:44:57 crc kubenswrapper[4778]: E1205 16:44:57.250000 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:44:58 crc kubenswrapper[4778]: I1205 16:44:58.288181 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:44:59 crc kubenswrapper[4778]: I1205 16:44:59.249419 4778 scope.go:117] "RemoveContainer" containerID="af5ad27761791f2b4a7e5e67001d35d92f2f607f9240e9d8ee3b7c8bbabefa51" Dec 05 16:44:59 crc kubenswrapper[4778]: E1205 16:44:59.250036 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:44:59 crc kubenswrapper[4778]: I1205 16:44:59.538267 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:00 crc 
kubenswrapper[4778]: I1205 16:45:00.169621 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c"] Dec 05 16:45:00 crc kubenswrapper[4778]: E1205 16:45:00.170329 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" containerName="extract-content" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.170348 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" containerName="extract-content" Dec 05 16:45:00 crc kubenswrapper[4778]: E1205 16:45:00.170380 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" containerName="extract-utilities" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.170388 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" containerName="extract-utilities" Dec 05 16:45:00 crc kubenswrapper[4778]: E1205 16:45:00.170402 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" containerName="registry-server" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.170408 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" containerName="registry-server" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.170557 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb86da0-3ae3-403e-808e-2dd41b1fe4c5" containerName="registry-server" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.171238 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.173971 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.177671 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.195609 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c"] Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.304807 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa14b9d4-49a5-44a7-b856-d67c99048f2a-config-volume\") pod \"collect-profiles-29415885-bnq6c\" (UID: \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.305046 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9n5p\" (UniqueName: \"kubernetes.io/projected/fa14b9d4-49a5-44a7-b856-d67c99048f2a-kube-api-access-q9n5p\") pod \"collect-profiles-29415885-bnq6c\" (UID: \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.305116 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa14b9d4-49a5-44a7-b856-d67c99048f2a-secret-volume\") pod 
\"collect-profiles-29415885-bnq6c\" (UID: \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.406668 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa14b9d4-49a5-44a7-b856-d67c99048f2a-config-volume\") pod \"collect-profiles-29415885-bnq6c\" (UID: \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.406850 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9n5p\" (UniqueName: \"kubernetes.io/projected/fa14b9d4-49a5-44a7-b856-d67c99048f2a-kube-api-access-q9n5p\") pod \"collect-profiles-29415885-bnq6c\" (UID: \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.406911 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa14b9d4-49a5-44a7-b856-d67c99048f2a-secret-volume\") pod \"collect-profiles-29415885-bnq6c\" (UID: \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.407744 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa14b9d4-49a5-44a7-b856-d67c99048f2a-config-volume\") pod \"collect-profiles-29415885-bnq6c\" (UID: \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.413465 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa14b9d4-49a5-44a7-b856-d67c99048f2a-secret-volume\") pod \"collect-profiles-29415885-bnq6c\" (UID: \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.426807 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9n5p\" (UniqueName: \"kubernetes.io/projected/fa14b9d4-49a5-44a7-b856-d67c99048f2a-kube-api-access-q9n5p\") pod \"collect-profiles-29415885-bnq6c\" (UID: \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:00 crc kubenswrapper[4778]: I1205 16:45:00.494693 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:01 crc kubenswrapper[4778]: I1205 16:45:00.718227 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:01 crc kubenswrapper[4778]: I1205 16:45:00.945939 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c"] Dec 05 16:45:01 crc kubenswrapper[4778]: I1205 16:45:01.331201 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" event={"ID":"fa14b9d4-49a5-44a7-b856-d67c99048f2a","Type":"ContainerStarted","Data":"f2a8878f791928d09a1f2dd5628957a7e88d18ca3e77fd1f48ac55412c098b2e"} Dec 05 16:45:01 crc kubenswrapper[4778]: I1205 16:45:01.331528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" event={"ID":"fa14b9d4-49a5-44a7-b856-d67c99048f2a","Type":"ContainerStarted","Data":"506f525071f750d84d5e844f6bb2d7622f4d65f969c9ab34e98d16879a83cd90"} Dec 05 16:45:01 crc kubenswrapper[4778]: I1205 16:45:01.893338 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:02 crc kubenswrapper[4778]: I1205 16:45:02.340949 4778 generic.go:334] "Generic (PLEG): container finished" podID="fa14b9d4-49a5-44a7-b856-d67c99048f2a" containerID="f2a8878f791928d09a1f2dd5628957a7e88d18ca3e77fd1f48ac55412c098b2e" exitCode=0 Dec 05 16:45:02 crc kubenswrapper[4778]: I1205 16:45:02.341009 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" event={"ID":"fa14b9d4-49a5-44a7-b856-d67c99048f2a","Type":"ContainerDied","Data":"f2a8878f791928d09a1f2dd5628957a7e88d18ca3e77fd1f48ac55412c098b2e"} Dec 05 16:45:03 crc kubenswrapper[4778]: I1205 16:45:03.094258 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:03 crc kubenswrapper[4778]: I1205 16:45:03.679585 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:03 crc kubenswrapper[4778]: I1205 16:45:03.761032 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9n5p\" (UniqueName: \"kubernetes.io/projected/fa14b9d4-49a5-44a7-b856-d67c99048f2a-kube-api-access-q9n5p\") pod \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\" (UID: \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\") " Dec 05 16:45:03 crc kubenswrapper[4778]: I1205 16:45:03.761114 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa14b9d4-49a5-44a7-b856-d67c99048f2a-config-volume\") pod \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\" (UID: \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\") " Dec 05 16:45:03 crc kubenswrapper[4778]: I1205 16:45:03.761211 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa14b9d4-49a5-44a7-b856-d67c99048f2a-secret-volume\") pod \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\" (UID: \"fa14b9d4-49a5-44a7-b856-d67c99048f2a\") " Dec 05 16:45:03 crc kubenswrapper[4778]: I1205 16:45:03.762023 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa14b9d4-49a5-44a7-b856-d67c99048f2a-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa14b9d4-49a5-44a7-b856-d67c99048f2a" (UID: "fa14b9d4-49a5-44a7-b856-d67c99048f2a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:45:03 crc kubenswrapper[4778]: I1205 16:45:03.762322 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa14b9d4-49a5-44a7-b856-d67c99048f2a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:45:03 crc kubenswrapper[4778]: I1205 16:45:03.768013 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa14b9d4-49a5-44a7-b856-d67c99048f2a-kube-api-access-q9n5p" (OuterVolumeSpecName: "kube-api-access-q9n5p") pod "fa14b9d4-49a5-44a7-b856-d67c99048f2a" (UID: "fa14b9d4-49a5-44a7-b856-d67c99048f2a"). InnerVolumeSpecName "kube-api-access-q9n5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:45:03 crc kubenswrapper[4778]: I1205 16:45:03.768939 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa14b9d4-49a5-44a7-b856-d67c99048f2a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa14b9d4-49a5-44a7-b856-d67c99048f2a" (UID: "fa14b9d4-49a5-44a7-b856-d67c99048f2a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:45:03 crc kubenswrapper[4778]: I1205 16:45:03.863995 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9n5p\" (UniqueName: \"kubernetes.io/projected/fa14b9d4-49a5-44a7-b856-d67c99048f2a-kube-api-access-q9n5p\") on node \"crc\" DevicePath \"\"" Dec 05 16:45:03 crc kubenswrapper[4778]: I1205 16:45:03.864059 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa14b9d4-49a5-44a7-b856-d67c99048f2a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:45:04 crc kubenswrapper[4778]: I1205 16:45:04.259892 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:04 crc kubenswrapper[4778]: I1205 16:45:04.354087 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" event={"ID":"fa14b9d4-49a5-44a7-b856-d67c99048f2a","Type":"ContainerDied","Data":"506f525071f750d84d5e844f6bb2d7622f4d65f969c9ab34e98d16879a83cd90"} Dec 05 16:45:04 crc kubenswrapper[4778]: I1205 16:45:04.354122 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="506f525071f750d84d5e844f6bb2d7622f4d65f969c9ab34e98d16879a83cd90" Dec 05 16:45:04 crc kubenswrapper[4778]: I1205 16:45:04.354346 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-bnq6c" Dec 05 16:45:04 crc kubenswrapper[4778]: I1205 16:45:04.426163 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr"] Dec 05 16:45:04 crc kubenswrapper[4778]: I1205 16:45:04.433089 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415840-5r8wr"] Dec 05 16:45:05 crc kubenswrapper[4778]: I1205 16:45:05.260879 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60" path="/var/lib/kubelet/pods/1edfd8a5-db6e-4f3e-bdf2-54cf1dce6a60/volumes" Dec 05 16:45:05 crc kubenswrapper[4778]: I1205 16:45:05.445697 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:06 crc kubenswrapper[4778]: I1205 16:45:06.626341 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:07 crc kubenswrapper[4778]: I1205 16:45:07.839419 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:09 crc kubenswrapper[4778]: I1205 16:45:09.039246 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:10 crc kubenswrapper[4778]: I1205 16:45:10.256991 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:11 crc kubenswrapper[4778]: 
I1205 16:45:11.250151 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:45:11 crc kubenswrapper[4778]: E1205 16:45:11.250392 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:45:11 crc kubenswrapper[4778]: I1205 16:45:11.448046 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:12 crc kubenswrapper[4778]: I1205 16:45:12.249542 4778 scope.go:117] "RemoveContainer" containerID="af5ad27761791f2b4a7e5e67001d35d92f2f607f9240e9d8ee3b7c8bbabefa51" Dec 05 16:45:12 crc kubenswrapper[4778]: E1205 16:45:12.249816 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:45:12 crc kubenswrapper[4778]: I1205 16:45:12.648809 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:13 crc kubenswrapper[4778]: I1205 16:45:13.833982 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:15 crc kubenswrapper[4778]: I1205 16:45:15.049089 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:16 crc kubenswrapper[4778]: I1205 16:45:16.296042 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:17 crc kubenswrapper[4778]: I1205 16:45:17.513256 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:18 crc kubenswrapper[4778]: I1205 16:45:18.718254 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:19 crc kubenswrapper[4778]: I1205 16:45:19.927193 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:21 crc kubenswrapper[4778]: I1205 16:45:21.116197 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:22 crc 
kubenswrapper[4778]: I1205 16:45:22.338105 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:23 crc kubenswrapper[4778]: I1205 16:45:23.253882 4778 scope.go:117] "RemoveContainer" containerID="af5ad27761791f2b4a7e5e67001d35d92f2f607f9240e9d8ee3b7c8bbabefa51" Dec 05 16:45:23 crc kubenswrapper[4778]: E1205 16:45:23.254408 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:45:23 crc kubenswrapper[4778]: I1205 16:45:23.565314 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:24 crc kubenswrapper[4778]: I1205 16:45:24.249451 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:45:24 crc kubenswrapper[4778]: E1205 16:45:24.249690 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:45:24 crc kubenswrapper[4778]: I1205 16:45:24.783757 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:25 crc kubenswrapper[4778]: I1205 16:45:25.929164 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:27 crc kubenswrapper[4778]: I1205 16:45:27.143315 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:28 crc kubenswrapper[4778]: I1205 16:45:28.324536 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:29 crc kubenswrapper[4778]: I1205 16:45:29.509026 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:30 crc kubenswrapper[4778]: I1205 16:45:30.665909 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:31 crc kubenswrapper[4778]: I1205 16:45:31.905907 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" 
Dec 05 16:45:33 crc kubenswrapper[4778]: I1205 16:45:33.065018 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:34 crc kubenswrapper[4778]: I1205 16:45:34.285007 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:35 crc kubenswrapper[4778]: I1205 16:45:35.469474 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:36 crc kubenswrapper[4778]: I1205 16:45:36.249509 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:45:36 crc kubenswrapper[4778]: E1205 16:45:36.250361 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:45:36 crc kubenswrapper[4778]: I1205 16:45:36.659068 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:37 crc kubenswrapper[4778]: I1205 16:45:37.881955 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:38 crc kubenswrapper[4778]: I1205 16:45:38.250104 4778 scope.go:117] "RemoveContainer" containerID="af5ad27761791f2b4a7e5e67001d35d92f2f607f9240e9d8ee3b7c8bbabefa51" Dec 05 16:45:38 crc kubenswrapper[4778]: E1205 16:45:38.250440 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:45:39 crc kubenswrapper[4778]: I1205 16:45:39.057742 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:40 crc kubenswrapper[4778]: I1205 16:45:40.237237 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:41 crc kubenswrapper[4778]: I1205 16:45:41.408795 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:42 crc kubenswrapper[4778]: I1205 16:45:42.619795 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:43 crc kubenswrapper[4778]: I1205 16:45:43.850692 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:45 crc kubenswrapper[4778]: I1205 16:45:45.008483 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:46 crc kubenswrapper[4778]: I1205 16:45:46.220895 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:47 crc kubenswrapper[4778]: I1205 16:45:47.404904 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:48 crc kubenswrapper[4778]: I1205 16:45:48.611515 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:49 crc kubenswrapper[4778]: I1205 16:45:49.044519 4778 scope.go:117] "RemoveContainer" containerID="f97cc2bd8fb7503e742293e22094fbb1c486e852a0e7f18b848021e492d704cf" Dec 05 16:45:49 crc kubenswrapper[4778]: I1205 16:45:49.249733 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:45:49 crc kubenswrapper[4778]: E1205 16:45:49.250473 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:45:49 crc kubenswrapper[4778]: I1205 16:45:49.798408 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:50 crc kubenswrapper[4778]: I1205 16:45:50.992811 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:52 crc kubenswrapper[4778]: I1205 16:45:52.207701 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:53 crc kubenswrapper[4778]: I1205 16:45:53.256898 4778 scope.go:117] "RemoveContainer" containerID="af5ad27761791f2b4a7e5e67001d35d92f2f607f9240e9d8ee3b7c8bbabefa51" Dec 05 16:45:53 crc kubenswrapper[4778]: E1205 16:45:53.257230 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604)\"" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" Dec 05 16:45:53 crc kubenswrapper[4778]: I1205 16:45:53.423034 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:54 crc kubenswrapper[4778]: I1205 16:45:54.610803 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:55 crc kubenswrapper[4778]: I1205 16:45:55.773028 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:56 crc kubenswrapper[4778]: I1205 16:45:56.972816 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:58 crc kubenswrapper[4778]: I1205 16:45:58.172871 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/watcher-decision-engine/5.log" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.314057 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd"] Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.321291 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lr4hd"] Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.377177 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher2ff6-account-delete-ss5kz"] Dec 05 16:45:59 crc kubenswrapper[4778]: E1205 16:45:59.377559 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa14b9d4-49a5-44a7-b856-d67c99048f2a" containerName="collect-profiles" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.377575 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa14b9d4-49a5-44a7-b856-d67c99048f2a" containerName="collect-profiles" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.377742 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa14b9d4-49a5-44a7-b856-d67c99048f2a" containerName="collect-profiles" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.378269 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.387837 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher2ff6-account-delete-ss5kz"] Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.432009 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.476604 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.476892 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="2e08d7c0-3b89-4afa-804d-e6f87469a631" containerName="watcher-applier" containerID="cri-o://ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d" gracePeriod=30 Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.486248 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00c0a9e-c371-43ee-a9c3-53875562acae-operator-scripts\") pod \"watcher2ff6-account-delete-ss5kz\" (UID: \"b00c0a9e-c371-43ee-a9c3-53875562acae\") " pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.498662 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vtln\" (UniqueName: \"kubernetes.io/projected/b00c0a9e-c371-43ee-a9c3-53875562acae-kube-api-access-6vtln\") pod \"watcher2ff6-account-delete-ss5kz\" (UID: \"b00c0a9e-c371-43ee-a9c3-53875562acae\") " pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.536907 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.537175 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" containerName="watcher-kuttl-api-log" containerID="cri-o://dcb573b79dfd62158fb8b8022f0559878e8c84bc7636832b984ad3484b8db717" gracePeriod=30 Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.537656 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" containerName="watcher-api" containerID="cri-o://486e83d28cd72050c59cf79c3b4db6ba28ce504ec0aa869f3857a24f96a5ab66" gracePeriod=30 Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.602745 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vtln\" (UniqueName: \"kubernetes.io/projected/b00c0a9e-c371-43ee-a9c3-53875562acae-kube-api-access-6vtln\") pod \"watcher2ff6-account-delete-ss5kz\" (UID: \"b00c0a9e-c371-43ee-a9c3-53875562acae\") " pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.602890 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00c0a9e-c371-43ee-a9c3-53875562acae-operator-scripts\") pod \"watcher2ff6-account-delete-ss5kz\" (UID: \"b00c0a9e-c371-43ee-a9c3-53875562acae\") " 
pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.604346 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00c0a9e-c371-43ee-a9c3-53875562acae-operator-scripts\") pod \"watcher2ff6-account-delete-ss5kz\" (UID: \"b00c0a9e-c371-43ee-a9c3-53875562acae\") " pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.664225 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vtln\" (UniqueName: \"kubernetes.io/projected/b00c0a9e-c371-43ee-a9c3-53875562acae-kube-api-access-6vtln\") pod \"watcher2ff6-account-delete-ss5kz\" (UID: \"b00c0a9e-c371-43ee-a9c3-53875562acae\") " pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.700518 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.809141 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.873765 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604","Type":"ContainerDied","Data":"82026988f9bcf75fb3e277bc272bbc0deae5d91fc29f4ac64fb039174e39ee44"} Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.874034 4778 scope.go:117] "RemoveContainer" containerID="af5ad27761791f2b4a7e5e67001d35d92f2f607f9240e9d8ee3b7c8bbabefa51" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.874141 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.894414 4778 generic.go:334] "Generic (PLEG): container finished" podID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" containerID="dcb573b79dfd62158fb8b8022f0559878e8c84bc7636832b984ad3484b8db717" exitCode=143 Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.894462 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5de6b2a4-6604-409e-8bd2-c74eb85ae51e","Type":"ContainerDied","Data":"dcb573b79dfd62158fb8b8022f0559878e8c84bc7636832b984ad3484b8db717"} Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.909083 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-custom-prometheus-ca\") pod \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.909133 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-combined-ca-bundle\") pod \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.909177 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-logs\") pod \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.909222 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-config-data\") pod \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.909307 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbvvb\" (UniqueName: \"kubernetes.io/projected/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-kube-api-access-qbvvb\") pod \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\" (UID: \"bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604\") " Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.910425 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-logs" (OuterVolumeSpecName: "logs") pod "bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" (UID: "bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.915509 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-kube-api-access-qbvvb" (OuterVolumeSpecName: "kube-api-access-qbvvb") pod "bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" (UID: "bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604"). InnerVolumeSpecName "kube-api-access-qbvvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.937697 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" (UID: "bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.944030 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" (UID: "bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:45:59 crc kubenswrapper[4778]: I1205 16:45:59.970513 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-config-data" (OuterVolumeSpecName: "config-data") pod "bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" (UID: "bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.010576 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbvvb\" (UniqueName: \"kubernetes.io/projected/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-kube-api-access-qbvvb\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.010617 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.010629 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.010642 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.010655 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.067633 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher2ff6-account-delete-ss5kz"] Dec 05 16:46:00 crc kubenswrapper[4778]: W1205 16:46:00.080491 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb00c0a9e_c371_43ee_a9c3_53875562acae.slice/crio-798fec6080380d9208604bf993de27ebc8811d41fde66c147d0d9f9962e9a4a6 WatchSource:0}: Error finding container 798fec6080380d9208604bf993de27ebc8811d41fde66c147d0d9f9962e9a4a6: Status 404 returned error can't find the container with id 798fec6080380d9208604bf993de27ebc8811d41fde66c147d0d9f9962e9a4a6 Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.230645 4778 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.239216 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:46:00 crc kubenswrapper[4778]: E1205 16:46:00.741066 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb00c0a9e_c371_43ee_a9c3_53875562acae.slice/crio-conmon-969d2a3ffbb83a9c155fd74710df0776a1476837232b7239b0dc4bf7f9a76ed9.scope\": RecentStats: unable to find data in memory cache]" Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.903859 4778 generic.go:334] "Generic (PLEG): container finished" podID="b00c0a9e-c371-43ee-a9c3-53875562acae" containerID="969d2a3ffbb83a9c155fd74710df0776a1476837232b7239b0dc4bf7f9a76ed9" exitCode=0 Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.903912 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" event={"ID":"b00c0a9e-c371-43ee-a9c3-53875562acae","Type":"ContainerDied","Data":"969d2a3ffbb83a9c155fd74710df0776a1476837232b7239b0dc4bf7f9a76ed9"} Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.903935 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" event={"ID":"b00c0a9e-c371-43ee-a9c3-53875562acae","Type":"ContainerStarted","Data":"798fec6080380d9208604bf993de27ebc8811d41fde66c147d0d9f9962e9a4a6"} Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.905969 4778 generic.go:334] "Generic (PLEG): container finished" podID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" containerID="486e83d28cd72050c59cf79c3b4db6ba28ce504ec0aa869f3857a24f96a5ab66" exitCode=0 Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.906051 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5de6b2a4-6604-409e-8bd2-c74eb85ae51e","Type":"ContainerDied","Data":"486e83d28cd72050c59cf79c3b4db6ba28ce504ec0aa869f3857a24f96a5ab66"} Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.945574 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.178:9322/\": dial tcp 10.217.0.178:9322: connect: connection refused" Dec 05 16:46:00 crc kubenswrapper[4778]: I1205 16:46:00.945641 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.178:9322/\": dial tcp 10.217.0.178:9322: connect: connection refused" Dec 05 16:46:00 crc kubenswrapper[4778]: E1205 16:46:00.994761 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:46:00 crc kubenswrapper[4778]: E1205 16:46:00.997788 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:46:00 crc kubenswrapper[4778]: E1205 16:46:00.998962 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:46:00 crc kubenswrapper[4778]: E1205 16:46:00.999265 4778 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="2e08d7c0-3b89-4afa-804d-e6f87469a631" containerName="watcher-applier" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.261085 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1695ca72-1c4c-496a-85cf-481567869c56" path="/var/lib/kubelet/pods/1695ca72-1c4c-496a-85cf-481567869c56/volumes" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.262220 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" path="/var/lib/kubelet/pods/bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604/volumes" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.368377 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.535523 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-combined-ca-bundle\") pod \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.535592 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l566\" (UniqueName: \"kubernetes.io/projected/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-kube-api-access-8l566\") pod \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.535619 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-config-data\") pod \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.535723 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-logs\") pod \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.535825 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-custom-prometheus-ca\") pod \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\" (UID: \"5de6b2a4-6604-409e-8bd2-c74eb85ae51e\") " Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.538571 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-logs" (OuterVolumeSpecName: "logs") pod "5de6b2a4-6604-409e-8bd2-c74eb85ae51e" (UID: "5de6b2a4-6604-409e-8bd2-c74eb85ae51e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.551654 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-kube-api-access-8l566" (OuterVolumeSpecName: "kube-api-access-8l566") pod "5de6b2a4-6604-409e-8bd2-c74eb85ae51e" (UID: "5de6b2a4-6604-409e-8bd2-c74eb85ae51e"). InnerVolumeSpecName "kube-api-access-8l566". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.583910 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "5de6b2a4-6604-409e-8bd2-c74eb85ae51e" (UID: "5de6b2a4-6604-409e-8bd2-c74eb85ae51e"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.587585 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5de6b2a4-6604-409e-8bd2-c74eb85ae51e" (UID: "5de6b2a4-6604-409e-8bd2-c74eb85ae51e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.603995 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-config-data" (OuterVolumeSpecName: "config-data") pod "5de6b2a4-6604-409e-8bd2-c74eb85ae51e" (UID: "5de6b2a4-6604-409e-8bd2-c74eb85ae51e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.638242 4778 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.638294 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.638307 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l566\" (UniqueName: \"kubernetes.io/projected/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-kube-api-access-8l566\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.638320 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.638335 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5de6b2a4-6604-409e-8bd2-c74eb85ae51e-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.915934 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.915920 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5de6b2a4-6604-409e-8bd2-c74eb85ae51e","Type":"ContainerDied","Data":"e86262e87ea9a6655ab9cf114b1733f91641e40e2cdfa2e57030006c0b78fc5c"} Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.916108 4778 scope.go:117] "RemoveContainer" containerID="486e83d28cd72050c59cf79c3b4db6ba28ce504ec0aa869f3857a24f96a5ab66" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.938985 4778 scope.go:117] "RemoveContainer" containerID="dcb573b79dfd62158fb8b8022f0559878e8c84bc7636832b984ad3484b8db717" Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.950304 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:46:01 crc kubenswrapper[4778]: I1205 16:46:01.960610 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:46:02 crc kubenswrapper[4778]: I1205 16:46:02.245491 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" Dec 05 16:46:02 crc kubenswrapper[4778]: I1205 16:46:02.347470 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00c0a9e-c371-43ee-a9c3-53875562acae-operator-scripts\") pod \"b00c0a9e-c371-43ee-a9c3-53875562acae\" (UID: \"b00c0a9e-c371-43ee-a9c3-53875562acae\") " Dec 05 16:46:02 crc kubenswrapper[4778]: I1205 16:46:02.347644 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vtln\" (UniqueName: \"kubernetes.io/projected/b00c0a9e-c371-43ee-a9c3-53875562acae-kube-api-access-6vtln\") pod \"b00c0a9e-c371-43ee-a9c3-53875562acae\" (UID: \"b00c0a9e-c371-43ee-a9c3-53875562acae\") " Dec 05 16:46:02 crc kubenswrapper[4778]: I1205 16:46:02.347993 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b00c0a9e-c371-43ee-a9c3-53875562acae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b00c0a9e-c371-43ee-a9c3-53875562acae" (UID: "b00c0a9e-c371-43ee-a9c3-53875562acae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:46:02 crc kubenswrapper[4778]: I1205 16:46:02.348244 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00c0a9e-c371-43ee-a9c3-53875562acae-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:02 crc kubenswrapper[4778]: I1205 16:46:02.359768 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b00c0a9e-c371-43ee-a9c3-53875562acae-kube-api-access-6vtln" (OuterVolumeSpecName: "kube-api-access-6vtln") pod "b00c0a9e-c371-43ee-a9c3-53875562acae" (UID: "b00c0a9e-c371-43ee-a9c3-53875562acae"). InnerVolumeSpecName "kube-api-access-6vtln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:46:02 crc kubenswrapper[4778]: I1205 16:46:02.449808 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vtln\" (UniqueName: \"kubernetes.io/projected/b00c0a9e-c371-43ee-a9c3-53875562acae-kube-api-access-6vtln\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:02 crc kubenswrapper[4778]: I1205 16:46:02.924529 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" event={"ID":"b00c0a9e-c371-43ee-a9c3-53875562acae","Type":"ContainerDied","Data":"798fec6080380d9208604bf993de27ebc8811d41fde66c147d0d9f9962e9a4a6"} Dec 05 16:46:02 crc kubenswrapper[4778]: I1205 16:46:02.924563 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="798fec6080380d9208604bf993de27ebc8811d41fde66c147d0d9f9962e9a4a6" Dec 05 16:46:02 crc kubenswrapper[4778]: I1205 16:46:02.924605 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher2ff6-account-delete-ss5kz" Dec 05 16:46:03 crc kubenswrapper[4778]: I1205 16:46:03.260742 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" path="/var/lib/kubelet/pods/5de6b2a4-6604-409e-8bd2-c74eb85ae51e/volumes" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.249431 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:46:04 crc kubenswrapper[4778]: E1205 16:46:04.249989 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.406255 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-vk6n7"] Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.414259 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-vk6n7"] Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.429007 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2"] Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.443704 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher2ff6-account-delete-ss5kz"] Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.454649 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-2ff6-account-create-update-bmvf2"] Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.462284 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher2ff6-account-delete-ss5kz"] Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.587556 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-pghpr"] Dec 05 16:46:04 crc kubenswrapper[4778]: E1205 16:46:04.588046 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" containerName="watcher-api" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588069 4778 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" containerName="watcher-api" Dec 05 16:46:04 crc kubenswrapper[4778]: E1205 16:46:04.588092 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588100 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: E1205 16:46:04.588116 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00c0a9e-c371-43ee-a9c3-53875562acae" containerName="mariadb-account-delete" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588124 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00c0a9e-c371-43ee-a9c3-53875562acae" containerName="mariadb-account-delete" Dec 05 16:46:04 crc kubenswrapper[4778]: E1205 16:46:04.588137 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588144 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: E1205 16:46:04.588156 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588166 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: E1205 16:46:04.588185 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" containerName="watcher-kuttl-api-log" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588193 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" containerName="watcher-kuttl-api-log" Dec 05 16:46:04 crc kubenswrapper[4778]: E1205 16:46:04.588208 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588216 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: E1205 16:46:04.588246 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588255 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: E1205 16:46:04.588264 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588271 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588520 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" containerName="watcher-kuttl-api-log" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588546 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588555 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588564 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588572 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b00c0a9e-c371-43ee-a9c3-53875562acae" containerName="mariadb-account-delete" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588583 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588595 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.588610 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5de6b2a4-6604-409e-8bd2-c74eb85ae51e" containerName="watcher-api" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.589347 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-pghpr" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.670424 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj"] Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.670956 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcda7cb7-7db4-4ae6-9d5b-0e67c1a36604" containerName="watcher-decision-engine" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.671574 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.675745 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.682214 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6kt6\" (UniqueName: \"kubernetes.io/projected/60883c3e-aae4-43bc-86d2-83b6801b6201-kube-api-access-l6kt6\") pod \"watcher-db-create-pghpr\" (UID: \"60883c3e-aae4-43bc-86d2-83b6801b6201\") " pod="watcher-kuttl-default/watcher-db-create-pghpr" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.682281 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60883c3e-aae4-43bc-86d2-83b6801b6201-operator-scripts\") pod \"watcher-db-create-pghpr\" (UID: \"60883c3e-aae4-43bc-86d2-83b6801b6201\") " pod="watcher-kuttl-default/watcher-db-create-pghpr" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.682694 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj"] Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.696737 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-pghpr"] Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.791467 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60883c3e-aae4-43bc-86d2-83b6801b6201-operator-scripts\") pod \"watcher-db-create-pghpr\" (UID: \"60883c3e-aae4-43bc-86d2-83b6801b6201\") " pod="watcher-kuttl-default/watcher-db-create-pghpr" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.791529 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d13bc57-d3d4-4278-b9a9-c90937db910e-operator-scripts\") pod \"watcher-3a5a-account-create-update-fljmj\" (UID: \"2d13bc57-d3d4-4278-b9a9-c90937db910e\") " pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.791599 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x66jf\" (UniqueName: \"kubernetes.io/projected/2d13bc57-d3d4-4278-b9a9-c90937db910e-kube-api-access-x66jf\") pod \"watcher-3a5a-account-create-update-fljmj\" (UID: \"2d13bc57-d3d4-4278-b9a9-c90937db910e\") " pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.791651 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6kt6\" (UniqueName: \"kubernetes.io/projected/60883c3e-aae4-43bc-86d2-83b6801b6201-kube-api-access-l6kt6\") pod \"watcher-db-create-pghpr\" (UID: \"60883c3e-aae4-43bc-86d2-83b6801b6201\") " pod="watcher-kuttl-default/watcher-db-create-pghpr" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.792474 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60883c3e-aae4-43bc-86d2-83b6801b6201-operator-scripts\") pod \"watcher-db-create-pghpr\" (UID: \"60883c3e-aae4-43bc-86d2-83b6801b6201\") " 
pod="watcher-kuttl-default/watcher-db-create-pghpr" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.819054 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6kt6\" (UniqueName: \"kubernetes.io/projected/60883c3e-aae4-43bc-86d2-83b6801b6201-kube-api-access-l6kt6\") pod \"watcher-db-create-pghpr\" (UID: \"60883c3e-aae4-43bc-86d2-83b6801b6201\") " pod="watcher-kuttl-default/watcher-db-create-pghpr" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.893469 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x66jf\" (UniqueName: \"kubernetes.io/projected/2d13bc57-d3d4-4278-b9a9-c90937db910e-kube-api-access-x66jf\") pod \"watcher-3a5a-account-create-update-fljmj\" (UID: \"2d13bc57-d3d4-4278-b9a9-c90937db910e\") " pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.893826 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d13bc57-d3d4-4278-b9a9-c90937db910e-operator-scripts\") pod \"watcher-3a5a-account-create-update-fljmj\" (UID: \"2d13bc57-d3d4-4278-b9a9-c90937db910e\") " pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.894433 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d13bc57-d3d4-4278-b9a9-c90937db910e-operator-scripts\") pod \"watcher-3a5a-account-create-update-fljmj\" (UID: \"2d13bc57-d3d4-4278-b9a9-c90937db910e\") " pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.912293 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x66jf\" (UniqueName: \"kubernetes.io/projected/2d13bc57-d3d4-4278-b9a9-c90937db910e-kube-api-access-x66jf\") pod \"watcher-3a5a-account-create-update-fljmj\" (UID: \"2d13bc57-d3d4-4278-b9a9-c90937db910e\") " pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.979636 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-pghpr" Dec 05 16:46:04 crc kubenswrapper[4778]: I1205 16:46:04.998964 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" Dec 05 16:46:05 crc kubenswrapper[4778]: I1205 16:46:05.265852 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2093d628-dfe1-4bdf-bc15-f148fef55c4e" path="/var/lib/kubelet/pods/2093d628-dfe1-4bdf-bc15-f148fef55c4e/volumes" Dec 05 16:46:05 crc kubenswrapper[4778]: I1205 16:46:05.273602 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="802eb834-b4e6-4946-abbe-636096f213c7" path="/var/lib/kubelet/pods/802eb834-b4e6-4946-abbe-636096f213c7/volumes" Dec 05 16:46:05 crc kubenswrapper[4778]: I1205 16:46:05.274198 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b00c0a9e-c371-43ee-a9c3-53875562acae" path="/var/lib/kubelet/pods/b00c0a9e-c371-43ee-a9c3-53875562acae/volumes" Dec 05 16:46:05 crc kubenswrapper[4778]: I1205 16:46:05.340805 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-pghpr"] Dec 05 16:46:05 crc kubenswrapper[4778]: W1205 16:46:05.360589 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60883c3e_aae4_43bc_86d2_83b6801b6201.slice/crio-448571a35410c86d76b7549d6af72ab587171ad6df031dc9b523f9fb19cdc47f WatchSource:0}: Error finding container 448571a35410c86d76b7549d6af72ab587171ad6df031dc9b523f9fb19cdc47f: Status 404 returned error can't find the container with id 448571a35410c86d76b7549d6af72ab587171ad6df031dc9b523f9fb19cdc47f Dec 05 16:46:05 crc kubenswrapper[4778]: I1205 16:46:05.498888 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj"] Dec 05 16:46:05 crc kubenswrapper[4778]: I1205 16:46:05.947804 4778 generic.go:334] "Generic (PLEG): container finished" podID="2e08d7c0-3b89-4afa-804d-e6f87469a631" containerID="ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d" exitCode=0 Dec 05 16:46:05 crc kubenswrapper[4778]: I1205 16:46:05.947901 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"2e08d7c0-3b89-4afa-804d-e6f87469a631","Type":"ContainerDied","Data":"ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d"} Dec 05 16:46:05 crc kubenswrapper[4778]: I1205 16:46:05.950350 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" event={"ID":"2d13bc57-d3d4-4278-b9a9-c90937db910e","Type":"ContainerStarted","Data":"240b772f8a1bf29826fe897efb7e060863334fd0d2772db5dc1e63fa5c34c841"} Dec 05 16:46:05 crc kubenswrapper[4778]: I1205 16:46:05.951627 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-pghpr" event={"ID":"60883c3e-aae4-43bc-86d2-83b6801b6201","Type":"ContainerStarted","Data":"448571a35410c86d76b7549d6af72ab587171ad6df031dc9b523f9fb19cdc47f"} Dec 05 16:46:05 crc kubenswrapper[4778]: E1205 16:46:05.993128 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d is running failed: container process not found" containerID="ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:46:05 crc kubenswrapper[4778]: E1205 16:46:05.993654 4778 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d is running failed: container process not found" containerID="ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:46:05 crc kubenswrapper[4778]: E1205 16:46:05.994050 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d is running failed: container process not found" containerID="ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 16:46:05 crc kubenswrapper[4778]: E1205 16:46:05.994115 4778 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d is running failed: container process not found" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="2e08d7c0-3b89-4afa-804d-e6f87469a631" containerName="watcher-applier" Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.523214 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.647089 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e08d7c0-3b89-4afa-804d-e6f87469a631-combined-ca-bundle\") pod \"2e08d7c0-3b89-4afa-804d-e6f87469a631\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.647197 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e08d7c0-3b89-4afa-804d-e6f87469a631-logs\") pod \"2e08d7c0-3b89-4afa-804d-e6f87469a631\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.647275 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w56v\" (UniqueName: \"kubernetes.io/projected/2e08d7c0-3b89-4afa-804d-e6f87469a631-kube-api-access-6w56v\") pod \"2e08d7c0-3b89-4afa-804d-e6f87469a631\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.647316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e08d7c0-3b89-4afa-804d-e6f87469a631-config-data\") pod \"2e08d7c0-3b89-4afa-804d-e6f87469a631\" (UID: \"2e08d7c0-3b89-4afa-804d-e6f87469a631\") " Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.647729 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e08d7c0-3b89-4afa-804d-e6f87469a631-logs" (OuterVolumeSpecName: "logs") pod "2e08d7c0-3b89-4afa-804d-e6f87469a631" (UID: "2e08d7c0-3b89-4afa-804d-e6f87469a631"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.652571 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e08d7c0-3b89-4afa-804d-e6f87469a631-kube-api-access-6w56v" (OuterVolumeSpecName: "kube-api-access-6w56v") pod "2e08d7c0-3b89-4afa-804d-e6f87469a631" (UID: "2e08d7c0-3b89-4afa-804d-e6f87469a631"). InnerVolumeSpecName "kube-api-access-6w56v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.669586 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e08d7c0-3b89-4afa-804d-e6f87469a631-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e08d7c0-3b89-4afa-804d-e6f87469a631" (UID: "2e08d7c0-3b89-4afa-804d-e6f87469a631"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.686584 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e08d7c0-3b89-4afa-804d-e6f87469a631-config-data" (OuterVolumeSpecName: "config-data") pod "2e08d7c0-3b89-4afa-804d-e6f87469a631" (UID: "2e08d7c0-3b89-4afa-804d-e6f87469a631"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.748938 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e08d7c0-3b89-4afa-804d-e6f87469a631-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.748972 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w56v\" (UniqueName: \"kubernetes.io/projected/2e08d7c0-3b89-4afa-804d-e6f87469a631-kube-api-access-6w56v\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.749009 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e08d7c0-3b89-4afa-804d-e6f87469a631-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.749020 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e08d7c0-3b89-4afa-804d-e6f87469a631-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.963822 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"2e08d7c0-3b89-4afa-804d-e6f87469a631","Type":"ContainerDied","Data":"b3519dc79fa6095c03c7ffca5d19ef8aa2492e0eda79e1fe4af3331f07f52063"} Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.963866 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.963888 4778 scope.go:117] "RemoveContainer" containerID="ac92c4f2bcbfd4650e719b155e0b72631f0d703ea16724fcbb59cc9d125ba63d" Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.967605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" event={"ID":"2d13bc57-d3d4-4278-b9a9-c90937db910e","Type":"ContainerStarted","Data":"adb5a97632ccfb50784f527b07b991250325245a40aecc1819c8fe4d216c3e55"} Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.969605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-pghpr" event={"ID":"60883c3e-aae4-43bc-86d2-83b6801b6201","Type":"ContainerStarted","Data":"a3e7eb7460c4450190fa92a0d359ff7a3f0dc1f787fb0d7f3d5fb00a19b2ad74"} Dec 05 16:46:06 crc kubenswrapper[4778]: I1205 16:46:06.986796 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" podStartSLOduration=2.986776658 podStartE2EDuration="2.986776658s" podCreationTimestamp="2025-12-05 16:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:46:06.982670227 +0000 UTC m=+3054.086466617" watchObservedRunningTime="2025-12-05 16:46:06.986776658 +0000 UTC m=+3054.090573038" Dec 05 16:46:07 crc kubenswrapper[4778]: I1205 16:46:07.002246 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db-create-pghpr" podStartSLOduration=3.002228674 podStartE2EDuration="3.002228674s" podCreationTimestamp="2025-12-05 16:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:46:06.99758236 +0000 UTC m=+3054.101378740" watchObservedRunningTime="2025-12-05 16:46:07.002228674 +0000 UTC m=+3054.106025054" Dec 05 16:46:07 crc kubenswrapper[4778]: I1205 16:46:07.019776 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:46:07 crc kubenswrapper[4778]: I1205 16:46:07.028804 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:46:07 crc kubenswrapper[4778]: I1205 16:46:07.260973 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e08d7c0-3b89-4afa-804d-e6f87469a631" path="/var/lib/kubelet/pods/2e08d7c0-3b89-4afa-804d-e6f87469a631/volumes" Dec 05 16:46:07 crc kubenswrapper[4778]: I1205 16:46:07.978756 4778 generic.go:334] "Generic (PLEG): container finished" podID="60883c3e-aae4-43bc-86d2-83b6801b6201" containerID="a3e7eb7460c4450190fa92a0d359ff7a3f0dc1f787fb0d7f3d5fb00a19b2ad74" exitCode=0 Dec 05 16:46:07 crc kubenswrapper[4778]: I1205 16:46:07.978862 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-pghpr" event={"ID":"60883c3e-aae4-43bc-86d2-83b6801b6201","Type":"ContainerDied","Data":"a3e7eb7460c4450190fa92a0d359ff7a3f0dc1f787fb0d7f3d5fb00a19b2ad74"} Dec 05 16:46:07 crc kubenswrapper[4778]: I1205 16:46:07.981965 4778 generic.go:334] "Generic (PLEG): container finished" podID="2d13bc57-d3d4-4278-b9a9-c90937db910e" containerID="adb5a97632ccfb50784f527b07b991250325245a40aecc1819c8fe4d216c3e55" exitCode=0 Dec 05 16:46:07 crc kubenswrapper[4778]: 
I1205 16:46:07.982032 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" event={"ID":"2d13bc57-d3d4-4278-b9a9-c90937db910e","Type":"ContainerDied","Data":"adb5a97632ccfb50784f527b07b991250325245a40aecc1819c8fe4d216c3e55"} Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.368790 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-pghpr" Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.376288 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.496734 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x66jf\" (UniqueName: \"kubernetes.io/projected/2d13bc57-d3d4-4278-b9a9-c90937db910e-kube-api-access-x66jf\") pod \"2d13bc57-d3d4-4278-b9a9-c90937db910e\" (UID: \"2d13bc57-d3d4-4278-b9a9-c90937db910e\") " Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.496783 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6kt6\" (UniqueName: \"kubernetes.io/projected/60883c3e-aae4-43bc-86d2-83b6801b6201-kube-api-access-l6kt6\") pod \"60883c3e-aae4-43bc-86d2-83b6801b6201\" (UID: \"60883c3e-aae4-43bc-86d2-83b6801b6201\") " Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.496855 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d13bc57-d3d4-4278-b9a9-c90937db910e-operator-scripts\") pod \"2d13bc57-d3d4-4278-b9a9-c90937db910e\" (UID: \"2d13bc57-d3d4-4278-b9a9-c90937db910e\") " Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.497381 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d13bc57-d3d4-4278-b9a9-c90937db910e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d13bc57-d3d4-4278-b9a9-c90937db910e" (UID: "2d13bc57-d3d4-4278-b9a9-c90937db910e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.498144 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60883c3e-aae4-43bc-86d2-83b6801b6201-operator-scripts\") pod \"60883c3e-aae4-43bc-86d2-83b6801b6201\" (UID: \"60883c3e-aae4-43bc-86d2-83b6801b6201\") " Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.498760 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d13bc57-d3d4-4278-b9a9-c90937db910e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.498902 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60883c3e-aae4-43bc-86d2-83b6801b6201-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60883c3e-aae4-43bc-86d2-83b6801b6201" (UID: "60883c3e-aae4-43bc-86d2-83b6801b6201"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.503304 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d13bc57-d3d4-4278-b9a9-c90937db910e-kube-api-access-x66jf" (OuterVolumeSpecName: "kube-api-access-x66jf") pod "2d13bc57-d3d4-4278-b9a9-c90937db910e" (UID: "2d13bc57-d3d4-4278-b9a9-c90937db910e"). InnerVolumeSpecName "kube-api-access-x66jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.503439 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60883c3e-aae4-43bc-86d2-83b6801b6201-kube-api-access-l6kt6" (OuterVolumeSpecName: "kube-api-access-l6kt6") pod "60883c3e-aae4-43bc-86d2-83b6801b6201" (UID: "60883c3e-aae4-43bc-86d2-83b6801b6201"). InnerVolumeSpecName "kube-api-access-l6kt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.599960 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x66jf\" (UniqueName: \"kubernetes.io/projected/2d13bc57-d3d4-4278-b9a9-c90937db910e-kube-api-access-x66jf\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.599995 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6kt6\" (UniqueName: \"kubernetes.io/projected/60883c3e-aae4-43bc-86d2-83b6801b6201-kube-api-access-l6kt6\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.600005 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60883c3e-aae4-43bc-86d2-83b6801b6201-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.999786 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" event={"ID":"2d13bc57-d3d4-4278-b9a9-c90937db910e","Type":"ContainerDied","Data":"240b772f8a1bf29826fe897efb7e060863334fd0d2772db5dc1e63fa5c34c841"} Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.999854 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="240b772f8a1bf29826fe897efb7e060863334fd0d2772db5dc1e63fa5c34c841" Dec 05 16:46:09 crc kubenswrapper[4778]: I1205 16:46:09.999817 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj" Dec 05 16:46:10 crc kubenswrapper[4778]: I1205 16:46:10.001521 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-pghpr" event={"ID":"60883c3e-aae4-43bc-86d2-83b6801b6201","Type":"ContainerDied","Data":"448571a35410c86d76b7549d6af72ab587171ad6df031dc9b523f9fb19cdc47f"} Dec 05 16:46:10 crc kubenswrapper[4778]: I1205 16:46:10.001553 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="448571a35410c86d76b7549d6af72ab587171ad6df031dc9b523f9fb19cdc47f" Dec 05 16:46:10 crc kubenswrapper[4778]: I1205 16:46:10.001565 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-pghpr" Dec 05 16:46:14 crc kubenswrapper[4778]: I1205 16:46:14.959857 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-jfd52"] Dec 05 16:46:14 crc kubenswrapper[4778]: E1205 16:46:14.960656 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e08d7c0-3b89-4afa-804d-e6f87469a631" containerName="watcher-applier" Dec 05 16:46:14 crc kubenswrapper[4778]: I1205 16:46:14.960670 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e08d7c0-3b89-4afa-804d-e6f87469a631" containerName="watcher-applier" Dec 05 16:46:14 crc kubenswrapper[4778]: E1205 16:46:14.960683 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d13bc57-d3d4-4278-b9a9-c90937db910e" containerName="mariadb-account-create-update" Dec 05 16:46:14 crc kubenswrapper[4778]: I1205 16:46:14.960689 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d13bc57-d3d4-4278-b9a9-c90937db910e" containerName="mariadb-account-create-update" Dec 05 16:46:14 crc kubenswrapper[4778]: E1205 16:46:14.960702 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60883c3e-aae4-43bc-86d2-83b6801b6201" containerName="mariadb-database-create" Dec 05 16:46:14 crc kubenswrapper[4778]: I1205 16:46:14.960773 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="60883c3e-aae4-43bc-86d2-83b6801b6201" containerName="mariadb-database-create" Dec 05 16:46:14 crc kubenswrapper[4778]: I1205 16:46:14.960935 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d13bc57-d3d4-4278-b9a9-c90937db910e" containerName="mariadb-account-create-update" Dec 05 16:46:14 crc kubenswrapper[4778]: I1205 16:46:14.960952 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e08d7c0-3b89-4afa-804d-e6f87469a631" containerName="watcher-applier" Dec 05 16:46:14 crc kubenswrapper[4778]: I1205 16:46:14.960965 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="60883c3e-aae4-43bc-86d2-83b6801b6201" containerName="mariadb-database-create" Dec 05 16:46:14 crc kubenswrapper[4778]: I1205 16:46:14.961530 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:14 crc kubenswrapper[4778]: I1205 16:46:14.968074 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-55msh" Dec 05 16:46:14 crc kubenswrapper[4778]: I1205 16:46:14.968074 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 05 16:46:14 crc kubenswrapper[4778]: I1205 16:46:14.970242 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-jfd52"] Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.090932 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-config-data\") pod \"watcher-kuttl-db-sync-jfd52\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.090977 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-jfd52\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.091007 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-db-sync-config-data\") pod \"watcher-kuttl-db-sync-jfd52\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.091221 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99qvm\" (UniqueName: \"kubernetes.io/projected/016cbd75-1b09-4c22-ad66-3f97406f16f7-kube-api-access-99qvm\") pod \"watcher-kuttl-db-sync-jfd52\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.192312 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-config-data\") pod \"watcher-kuttl-db-sync-jfd52\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.192354 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-jfd52\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.192403 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-db-sync-config-data\") pod \"watcher-kuttl-db-sync-jfd52\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc 
kubenswrapper[4778]: I1205 16:46:15.192451 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99qvm\" (UniqueName: \"kubernetes.io/projected/016cbd75-1b09-4c22-ad66-3f97406f16f7-kube-api-access-99qvm\") pod \"watcher-kuttl-db-sync-jfd52\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.197626 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-db-sync-config-data\") pod \"watcher-kuttl-db-sync-jfd52\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.198571 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-jfd52\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.198951 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-config-data\") pod \"watcher-kuttl-db-sync-jfd52\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.220852 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99qvm\" (UniqueName: \"kubernetes.io/projected/016cbd75-1b09-4c22-ad66-3f97406f16f7-kube-api-access-99qvm\") pod \"watcher-kuttl-db-sync-jfd52\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.280941 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:15 crc kubenswrapper[4778]: I1205 16:46:15.721936 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-jfd52"] Dec 05 16:46:16 crc kubenswrapper[4778]: I1205 16:46:16.049356 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" event={"ID":"016cbd75-1b09-4c22-ad66-3f97406f16f7","Type":"ContainerStarted","Data":"d1e63dd94bea9419c5ddb14850476f117c62b99f060b15aa49ef4c283dae89d3"} Dec 05 16:46:16 crc kubenswrapper[4778]: I1205 16:46:16.049741 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" event={"ID":"016cbd75-1b09-4c22-ad66-3f97406f16f7","Type":"ContainerStarted","Data":"7ede10c5c0c20605300bfd1885f0eb4956052742dbb05e103af4fde08bf5f6b1"} Dec 05 16:46:16 crc kubenswrapper[4778]: I1205 16:46:16.067909 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" podStartSLOduration=2.067889844 podStartE2EDuration="2.067889844s" podCreationTimestamp="2025-12-05 16:46:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:46:16.067030661 +0000 UTC m=+3063.170827051" watchObservedRunningTime="2025-12-05 16:46:16.067889844 +0000 UTC m=+3063.171686224" Dec 05 16:46:17 crc kubenswrapper[4778]: I1205 16:46:17.249028 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:46:17 crc kubenswrapper[4778]: E1205 16:46:17.249567 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:46:19 crc kubenswrapper[4778]: I1205 16:46:19.073332 4778 generic.go:334] "Generic (PLEG): container finished" podID="016cbd75-1b09-4c22-ad66-3f97406f16f7" containerID="d1e63dd94bea9419c5ddb14850476f117c62b99f060b15aa49ef4c283dae89d3" exitCode=0 Dec 05 16:46:19 crc kubenswrapper[4778]: I1205 16:46:19.073397 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" event={"ID":"016cbd75-1b09-4c22-ad66-3f97406f16f7","Type":"ContainerDied","Data":"d1e63dd94bea9419c5ddb14850476f117c62b99f060b15aa49ef4c283dae89d3"} Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.458070 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.575347 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99qvm\" (UniqueName: \"kubernetes.io/projected/016cbd75-1b09-4c22-ad66-3f97406f16f7-kube-api-access-99qvm\") pod \"016cbd75-1b09-4c22-ad66-3f97406f16f7\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.575460 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-combined-ca-bundle\") pod \"016cbd75-1b09-4c22-ad66-3f97406f16f7\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.575628 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-config-data\") pod \"016cbd75-1b09-4c22-ad66-3f97406f16f7\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.575655 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-db-sync-config-data\") pod \"016cbd75-1b09-4c22-ad66-3f97406f16f7\" (UID: \"016cbd75-1b09-4c22-ad66-3f97406f16f7\") " Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.581572 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/016cbd75-1b09-4c22-ad66-3f97406f16f7-kube-api-access-99qvm" (OuterVolumeSpecName: "kube-api-access-99qvm") pod "016cbd75-1b09-4c22-ad66-3f97406f16f7" (UID: "016cbd75-1b09-4c22-ad66-3f97406f16f7"). InnerVolumeSpecName "kube-api-access-99qvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.587118 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "016cbd75-1b09-4c22-ad66-3f97406f16f7" (UID: "016cbd75-1b09-4c22-ad66-3f97406f16f7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.603976 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "016cbd75-1b09-4c22-ad66-3f97406f16f7" (UID: "016cbd75-1b09-4c22-ad66-3f97406f16f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.620765 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-config-data" (OuterVolumeSpecName: "config-data") pod "016cbd75-1b09-4c22-ad66-3f97406f16f7" (UID: "016cbd75-1b09-4c22-ad66-3f97406f16f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.677651 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.677694 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.677704 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99qvm\" (UniqueName: \"kubernetes.io/projected/016cbd75-1b09-4c22-ad66-3f97406f16f7-kube-api-access-99qvm\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:20 crc kubenswrapper[4778]: I1205 16:46:20.677717 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016cbd75-1b09-4c22-ad66-3f97406f16f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.095199 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" event={"ID":"016cbd75-1b09-4c22-ad66-3f97406f16f7","Type":"ContainerDied","Data":"7ede10c5c0c20605300bfd1885f0eb4956052742dbb05e103af4fde08bf5f6b1"} Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.095253 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ede10c5c0c20605300bfd1885f0eb4956052742dbb05e103af4fde08bf5f6b1" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.095221 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-jfd52" Dec 05 16:46:21 crc kubenswrapper[4778]: E1205 16:46:21.186430 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod016cbd75_1b09_4c22_ad66_3f97406f16f7.slice\": RecentStats: unable to find data in memory cache]" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.397548 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:46:21 crc kubenswrapper[4778]: E1205 16:46:21.398313 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016cbd75-1b09-4c22-ad66-3f97406f16f7" containerName="watcher-kuttl-db-sync" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.398337 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="016cbd75-1b09-4c22-ad66-3f97406f16f7" containerName="watcher-kuttl-db-sync" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.398565 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="016cbd75-1b09-4c22-ad66-3f97406f16f7" containerName="watcher-kuttl-db-sync" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.399734 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.403589 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-55msh" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.403686 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.419057 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.420556 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.422597 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.441407 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.466195 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.488576 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw7jb\" (UniqueName: \"kubernetes.io/projected/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-kube-api-access-dw7jb\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.488646 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xssqn\" (UniqueName: \"kubernetes.io/projected/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-kube-api-access-xssqn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.488714 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.488735 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.488753 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.488783 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.488828 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-logs\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.488863 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.488898 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.488917 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.498932 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.500246 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.503690 4778 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.507165 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.589776 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.589994 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.590097 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.590262 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-logs\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.590332 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w86lc\" (UniqueName: \"kubernetes.io/projected/819d647d-6170-4b57-a849-7d686ddf2d65-kube-api-access-w86lc\") pod \"watcher-kuttl-applier-0\" (UID: \"819d647d-6170-4b57-a849-7d686ddf2d65\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.590476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/819d647d-6170-4b57-a849-7d686ddf2d65-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"819d647d-6170-4b57-a849-7d686ddf2d65\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.590545 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.590623 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819d647d-6170-4b57-a849-7d686ddf2d65-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"819d647d-6170-4b57-a849-7d686ddf2d65\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.590733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-logs\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.590497 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.590859 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.590964 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.591088 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw7jb\" (UniqueName: \"kubernetes.io/projected/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-kube-api-access-dw7jb\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.591255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xssqn\" (UniqueName: \"kubernetes.io/projected/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-kube-api-access-xssqn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.591488 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/819d647d-6170-4b57-a849-7d686ddf2d65-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"819d647d-6170-4b57-a849-7d686ddf2d65\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.591679 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.595010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.595581 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.605171 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.606852 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.607819 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.609004 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xssqn\" (UniqueName: \"kubernetes.io/projected/eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b-kube-api-access-xssqn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.614119 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw7jb\" (UniqueName: \"kubernetes.io/projected/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-kube-api-access-dw7jb\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.614891 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.693277 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w86lc\" (UniqueName: \"kubernetes.io/projected/819d647d-6170-4b57-a849-7d686ddf2d65-kube-api-access-w86lc\") pod \"watcher-kuttl-applier-0\" (UID: \"819d647d-6170-4b57-a849-7d686ddf2d65\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.693355 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/819d647d-6170-4b57-a849-7d686ddf2d65-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"819d647d-6170-4b57-a849-7d686ddf2d65\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.693393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819d647d-6170-4b57-a849-7d686ddf2d65-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"819d647d-6170-4b57-a849-7d686ddf2d65\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.693464 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/819d647d-6170-4b57-a849-7d686ddf2d65-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"819d647d-6170-4b57-a849-7d686ddf2d65\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.694269 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/819d647d-6170-4b57-a849-7d686ddf2d65-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"819d647d-6170-4b57-a849-7d686ddf2d65\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.697133 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/819d647d-6170-4b57-a849-7d686ddf2d65-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"819d647d-6170-4b57-a849-7d686ddf2d65\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.697989 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819d647d-6170-4b57-a849-7d686ddf2d65-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"819d647d-6170-4b57-a849-7d686ddf2d65\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.719030 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w86lc\" (UniqueName: \"kubernetes.io/projected/819d647d-6170-4b57-a849-7d686ddf2d65-kube-api-access-w86lc\") pod \"watcher-kuttl-applier-0\" (UID: \"819d647d-6170-4b57-a849-7d686ddf2d65\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.719510 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.743080 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:21 crc kubenswrapper[4778]: I1205 16:46:21.824769 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:22 crc kubenswrapper[4778]: I1205 16:46:22.215916 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 16:46:22 crc kubenswrapper[4778]: I1205 16:46:22.287181 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 16:46:22 crc kubenswrapper[4778]: W1205 16:46:22.290412 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaaac528_e568_4bd8_a7e9_eebdcbdc4b7b.slice/crio-1003372e5bd7cdf06ec9e1526bf924be77e17eb7cc90443c143cf386110d39d6 WatchSource:0}: Error finding container 1003372e5bd7cdf06ec9e1526bf924be77e17eb7cc90443c143cf386110d39d6: Status 404 returned error can't find the container with id 1003372e5bd7cdf06ec9e1526bf924be77e17eb7cc90443c143cf386110d39d6 Dec 05 16:46:22 crc kubenswrapper[4778]: W1205 16:46:22.322727 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod819d647d_6170_4b57_a849_7d686ddf2d65.slice/crio-9ac4c4d1eafe9620cf819544b81ef33126474da3440c1b4bd148cb5cd6aaae00 WatchSource:0}: Error finding container 9ac4c4d1eafe9620cf819544b81ef33126474da3440c1b4bd148cb5cd6aaae00: Status 404 returned error can't find the container with id 9ac4c4d1eafe9620cf819544b81ef33126474da3440c1b4bd148cb5cd6aaae00 Dec 05 16:46:22 crc kubenswrapper[4778]: I1205 16:46:22.325343 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.127107 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4","Type":"ContainerStarted","Data":"9a8d19048e59a003505b6b73d61de38c872ec540235b9263b2514ca75b761778"} Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.127389 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4","Type":"ContainerStarted","Data":"1520b97ddb76c8f193788c67d5e533d5531080fb564a8ff88a7706a06a663b4c"} Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.127528 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.127562 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4","Type":"ContainerStarted","Data":"cec7d17491818dd694df589a7a4e688b64beb047ab2d017b0e6fccf2984b4737"} Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.129294 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"819d647d-6170-4b57-a849-7d686ddf2d65","Type":"ContainerStarted","Data":"130aa38f676b90a3dafcee7fe51c2e180795c60c5a62c58c9b63fdbdd90b807c"} Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.129326 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"819d647d-6170-4b57-a849-7d686ddf2d65","Type":"ContainerStarted","Data":"9ac4c4d1eafe9620cf819544b81ef33126474da3440c1b4bd148cb5cd6aaae00"} Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.133028 4778 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerStarted","Data":"ae72cc2d1649dd8f8774761185a3a3cc2c21d5facf7013777f9c08567a2e4be0"} Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.133060 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerStarted","Data":"1003372e5bd7cdf06ec9e1526bf924be77e17eb7cc90443c143cf386110d39d6"} Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.149662 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.149623301 podStartE2EDuration="2.149623301s" podCreationTimestamp="2025-12-05 16:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:46:23.148085 +0000 UTC m=+3070.251881390" watchObservedRunningTime="2025-12-05 16:46:23.149623301 +0000 UTC m=+3070.253419681" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.178510 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.17847601 podStartE2EDuration="2.17847601s" podCreationTimestamp="2025-12-05 16:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:46:23.171984914 +0000 UTC m=+3070.275781294" watchObservedRunningTime="2025-12-05 16:46:23.17847601 +0000 UTC m=+3070.282272390" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.205199 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.205157658 podStartE2EDuration="2.205157658s" podCreationTimestamp="2025-12-05 16:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:46:23.190273847 +0000 UTC m=+3070.294070238" watchObservedRunningTime="2025-12-05 16:46:23.205157658 +0000 UTC m=+3070.308954048" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.729355 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j2wv9"] Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.733112 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.738893 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2wv9"] Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.842709 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-catalog-content\") pod \"redhat-marketplace-j2wv9\" (UID: \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\") " pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.842758 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ns9c\" (UniqueName: \"kubernetes.io/projected/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-kube-api-access-5ns9c\") pod \"redhat-marketplace-j2wv9\" (UID: \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\") " pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.842993 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-utilities\") pod \"redhat-marketplace-j2wv9\" (UID: \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\") " pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.944381 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-catalog-content\") pod \"redhat-marketplace-j2wv9\" (UID: \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\") " pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.944434 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ns9c\" (UniqueName: \"kubernetes.io/projected/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-kube-api-access-5ns9c\") pod \"redhat-marketplace-j2wv9\" (UID: \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\") " pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.944529 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-utilities\") pod \"redhat-marketplace-j2wv9\" (UID: \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\") " pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.944953 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-catalog-content\") pod \"redhat-marketplace-j2wv9\" (UID: \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\") " pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.945043 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-utilities\") pod \"redhat-marketplace-j2wv9\" (UID: \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\") " pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:23 crc kubenswrapper[4778]: I1205 16:46:23.967669 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5ns9c\" (UniqueName: \"kubernetes.io/projected/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-kube-api-access-5ns9c\") pod \"redhat-marketplace-j2wv9\" (UID: \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\") " pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:24 crc kubenswrapper[4778]: I1205 16:46:24.057224 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:24 crc kubenswrapper[4778]: I1205 16:46:24.598689 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2wv9"] Dec 05 16:46:25 crc kubenswrapper[4778]: I1205 16:46:25.152202 4778 generic.go:334] "Generic (PLEG): container finished" podID="7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" containerID="e1eb3ccde7cfea1754c885f354add5107663d0339e92f4a467aa138d3c558a67" exitCode=0 Dec 05 16:46:25 crc kubenswrapper[4778]: I1205 16:46:25.152287 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2wv9" event={"ID":"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06","Type":"ContainerDied","Data":"e1eb3ccde7cfea1754c885f354add5107663d0339e92f4a467aa138d3c558a67"} Dec 05 16:46:25 crc kubenswrapper[4778]: I1205 16:46:25.152444 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2wv9" event={"ID":"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06","Type":"ContainerStarted","Data":"24a8056bf43a6ed027c7fa2ec27c60ca4d9a2c5c403ee5edc17bcdd8770d43db"} Dec 05 16:46:25 crc kubenswrapper[4778]: I1205 16:46:25.568132 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:26 crc kubenswrapper[4778]: I1205 16:46:26.161398 4778 generic.go:334] "Generic (PLEG): container finished" podID="7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" containerID="1bfb7276446a0beb249451505e20f076a83167fef69464ab7154d9b02fe146ff" exitCode=0 Dec 05 16:46:26 crc kubenswrapper[4778]: I1205 16:46:26.161471 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2wv9" event={"ID":"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06","Type":"ContainerDied","Data":"1bfb7276446a0beb249451505e20f076a83167fef69464ab7154d9b02fe146ff"} Dec 05 16:46:26 crc kubenswrapper[4778]: I1205 16:46:26.163055 4778 generic.go:334] "Generic (PLEG): container finished" podID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" containerID="ae72cc2d1649dd8f8774761185a3a3cc2c21d5facf7013777f9c08567a2e4be0" exitCode=1 Dec 05 16:46:26 crc kubenswrapper[4778]: I1205 16:46:26.163094 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerDied","Data":"ae72cc2d1649dd8f8774761185a3a3cc2c21d5facf7013777f9c08567a2e4be0"} Dec 05 16:46:26 crc kubenswrapper[4778]: I1205 16:46:26.163389 4778 scope.go:117] "RemoveContainer" containerID="ae72cc2d1649dd8f8774761185a3a3cc2c21d5facf7013777f9c08567a2e4be0" Dec 05 16:46:26 crc kubenswrapper[4778]: I1205 16:46:26.719848 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:26 crc kubenswrapper[4778]: I1205 16:46:26.825605 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:27 crc kubenswrapper[4778]: I1205 16:46:27.177893 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
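
The pod_startup_latency_tracker records above and just below reconcile arithmetically: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). For the watcher pods the pull timestamps are the zero time "0001-01-01 00:00:00 +0000 UTC", meaning no image was pulled, so the two durations match; for redhat-marketplace-j2wv9 below, a 1.414s pull explains the gap between 4.221s end to end and the 2.807s SLO figure. A small Go check of that arithmetic using the logged values:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Values from the redhat-marketplace-j2wv9 record below.
	created := time.Date(2025, 12, 5, 16, 46, 23, 0, time.UTC)
	firstPull := time.Date(2025, 12, 5, 16, 46, 25, 153969154, time.UTC)
	lastPull := time.Date(2025, 12, 5, 16, 46, 26, 568545428, time.UTC)
	running := time.Date(2025, 12, 5, 16, 46, 27, 221275714, time.UTC)

	e2e := running.Sub(created)
	slo := e2e
	if !firstPull.IsZero() { // zero pull timestamps mean the image was already present
		slo -= lastPull.Sub(firstPull)
	}
	fmt.Println("podStartE2EDuration:", e2e) // 4.221275714s
	fmt.Println("podStartSLOduration:", slo) // 2.80669944s
}
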
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerStarted","Data":"d81039bcb3015dfb34076a8bdda4e4bd7a9411abe04d38c2c5b9a1af26d373b9"} Dec 05 16:46:27 crc kubenswrapper[4778]: I1205 16:46:27.182986 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2wv9" event={"ID":"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06","Type":"ContainerStarted","Data":"727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247"} Dec 05 16:46:27 crc kubenswrapper[4778]: I1205 16:46:27.221292 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j2wv9" podStartSLOduration=2.80669944 podStartE2EDuration="4.221275714s" podCreationTimestamp="2025-12-05 16:46:23 +0000 UTC" firstStartedPulling="2025-12-05 16:46:25.153969154 +0000 UTC m=+3072.257765534" lastFinishedPulling="2025-12-05 16:46:26.568545428 +0000 UTC m=+3073.672341808" observedRunningTime="2025-12-05 16:46:27.216097714 +0000 UTC m=+3074.319894104" watchObservedRunningTime="2025-12-05 16:46:27.221275714 +0000 UTC m=+3074.325072094" Dec 05 16:46:29 crc kubenswrapper[4778]: I1205 16:46:29.199730 4778 generic.go:334] "Generic (PLEG): container finished" podID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" containerID="d81039bcb3015dfb34076a8bdda4e4bd7a9411abe04d38c2c5b9a1af26d373b9" exitCode=1 Dec 05 16:46:29 crc kubenswrapper[4778]: I1205 16:46:29.199801 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerDied","Data":"d81039bcb3015dfb34076a8bdda4e4bd7a9411abe04d38c2c5b9a1af26d373b9"} Dec 05 16:46:29 crc kubenswrapper[4778]: I1205 16:46:29.200068 4778 scope.go:117] "RemoveContainer" containerID="ae72cc2d1649dd8f8774761185a3a3cc2c21d5facf7013777f9c08567a2e4be0" Dec 05 16:46:29 crc kubenswrapper[4778]: I1205 16:46:29.200669 4778 scope.go:117] "RemoveContainer" containerID="d81039bcb3015dfb34076a8bdda4e4bd7a9411abe04d38c2c5b9a1af26d373b9" Dec 05 16:46:29 crc kubenswrapper[4778]: E1205 16:46:29.200948 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:46:29 crc kubenswrapper[4778]: I1205 16:46:29.252703 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:46:29 crc kubenswrapper[4778]: E1205 16:46:29.252971 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:46:31 crc kubenswrapper[4778]: I1205 16:46:31.720485 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:31 crc kubenswrapper[4778]: I1205 16:46:31.724944 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:31 crc kubenswrapper[4778]: I1205 16:46:31.744662 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:31 crc kubenswrapper[4778]: I1205 16:46:31.745076 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:31 crc kubenswrapper[4778]: I1205 16:46:31.745897 4778 scope.go:117] "RemoveContainer" containerID="d81039bcb3015dfb34076a8bdda4e4bd7a9411abe04d38c2c5b9a1af26d373b9" Dec 05 16:46:31 crc kubenswrapper[4778]: E1205 16:46:31.746232 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:46:31 crc kubenswrapper[4778]: I1205 16:46:31.826568 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:31 crc kubenswrapper[4778]: I1205 16:46:31.851529 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:32 crc kubenswrapper[4778]: I1205 16:46:32.235855 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 16:46:32 crc kubenswrapper[4778]: I1205 16:46:32.275319 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 16:46:34 crc kubenswrapper[4778]: I1205 16:46:34.060480 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:34 crc kubenswrapper[4778]: I1205 16:46:34.060547 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:34 crc kubenswrapper[4778]: I1205 16:46:34.117428 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:34 crc kubenswrapper[4778]: I1205 16:46:34.286312 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:37 crc kubenswrapper[4778]: I1205 16:46:37.711540 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2wv9"] Dec 05 16:46:37 crc kubenswrapper[4778]: I1205 16:46:37.712082 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j2wv9" podUID="7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" containerName="registry-server" containerID="cri-o://727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247" gracePeriod=2 Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.206112 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.274455 4778 generic.go:334] "Generic (PLEG): container finished" podID="7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" containerID="727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247" exitCode=0 Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.274500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2wv9" event={"ID":"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06","Type":"ContainerDied","Data":"727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247"} Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.274525 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2wv9" event={"ID":"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06","Type":"ContainerDied","Data":"24a8056bf43a6ed027c7fa2ec27c60ca4d9a2c5c403ee5edc17bcdd8770d43db"} Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.274522 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2wv9" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.274539 4778 scope.go:117] "RemoveContainer" containerID="727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.293396 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-utilities\") pod \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\" (UID: \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\") " Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.293565 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-catalog-content\") pod \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\" (UID: \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\") " Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.293595 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ns9c\" (UniqueName: \"kubernetes.io/projected/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-kube-api-access-5ns9c\") pod \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\" (UID: \"7a9ebd54-1db7-407a-a41e-d01d9c6ddf06\") " Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.295527 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-utilities" (OuterVolumeSpecName: "utilities") pod "7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" (UID: "7a9ebd54-1db7-407a-a41e-d01d9c6ddf06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.306692 4778 scope.go:117] "RemoveContainer" containerID="1bfb7276446a0beb249451505e20f076a83167fef69464ab7154d9b02fe146ff" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.307514 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-kube-api-access-5ns9c" (OuterVolumeSpecName: "kube-api-access-5ns9c") pod "7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" (UID: "7a9ebd54-1db7-407a-a41e-d01d9c6ddf06"). InnerVolumeSpecName "kube-api-access-5ns9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.318226 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" (UID: "7a9ebd54-1db7-407a-a41e-d01d9c6ddf06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.355105 4778 scope.go:117] "RemoveContainer" containerID="e1eb3ccde7cfea1754c885f354add5107663d0339e92f4a467aa138d3c558a67" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.382766 4778 scope.go:117] "RemoveContainer" containerID="727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247" Dec 05 16:46:38 crc kubenswrapper[4778]: E1205 16:46:38.383242 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247\": container with ID starting with 727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247 not found: ID does not exist" containerID="727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.383275 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247"} err="failed to get container status \"727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247\": rpc error: code = NotFound desc = could not find container \"727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247\": container with ID starting with 727d05b36bbdb59361107995c3112dad74222354bc067ef54676a631c8ed1247 not found: ID does not exist" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.383300 4778 scope.go:117] "RemoveContainer" containerID="1bfb7276446a0beb249451505e20f076a83167fef69464ab7154d9b02fe146ff" Dec 05 16:46:38 crc kubenswrapper[4778]: E1205 16:46:38.383852 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bfb7276446a0beb249451505e20f076a83167fef69464ab7154d9b02fe146ff\": container with ID starting with 1bfb7276446a0beb249451505e20f076a83167fef69464ab7154d9b02fe146ff not found: ID does not exist" containerID="1bfb7276446a0beb249451505e20f076a83167fef69464ab7154d9b02fe146ff" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.383911 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bfb7276446a0beb249451505e20f076a83167fef69464ab7154d9b02fe146ff"} err="failed to get container status \"1bfb7276446a0beb249451505e20f076a83167fef69464ab7154d9b02fe146ff\": rpc error: code = NotFound desc = could not find container \"1bfb7276446a0beb249451505e20f076a83167fef69464ab7154d9b02fe146ff\": container with ID starting with 1bfb7276446a0beb249451505e20f076a83167fef69464ab7154d9b02fe146ff not found: ID does not exist" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.383939 4778 scope.go:117] "RemoveContainer" containerID="e1eb3ccde7cfea1754c885f354add5107663d0339e92f4a467aa138d3c558a67" Dec 05 16:46:38 crc kubenswrapper[4778]: E1205 16:46:38.384252 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e1eb3ccde7cfea1754c885f354add5107663d0339e92f4a467aa138d3c558a67\": container with ID starting with e1eb3ccde7cfea1754c885f354add5107663d0339e92f4a467aa138d3c558a67 not found: ID does not exist" containerID="e1eb3ccde7cfea1754c885f354add5107663d0339e92f4a467aa138d3c558a67" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.384277 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1eb3ccde7cfea1754c885f354add5107663d0339e92f4a467aa138d3c558a67"} err="failed to get container status \"e1eb3ccde7cfea1754c885f354add5107663d0339e92f4a467aa138d3c558a67\": rpc error: code = NotFound desc = could not find container \"e1eb3ccde7cfea1754c885f354add5107663d0339e92f4a467aa138d3c558a67\": container with ID starting with e1eb3ccde7cfea1754c885f354add5107663d0339e92f4a467aa138d3c558a67 not found: ID does not exist" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.395550 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.395598 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.395613 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ns9c\" (UniqueName: \"kubernetes.io/projected/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06-kube-api-access-5ns9c\") on node \"crc\" DevicePath \"\"" Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.610123 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2wv9"] Dec 05 16:46:38 crc kubenswrapper[4778]: I1205 16:46:38.616792 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2wv9"] Dec 05 16:46:39 crc kubenswrapper[4778]: I1205 16:46:39.261398 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" path="/var/lib/kubelet/pods/7a9ebd54-1db7-407a-a41e-d01d9c6ddf06/volumes" Dec 05 16:46:41 crc kubenswrapper[4778]: I1205 16:46:41.249832 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:46:41 crc kubenswrapper[4778]: E1205 16:46:41.251176 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:46:47 crc kubenswrapper[4778]: I1205 16:46:47.249624 4778 scope.go:117] "RemoveContainer" containerID="d81039bcb3015dfb34076a8bdda4e4bd7a9411abe04d38c2c5b9a1af26d373b9" Dec 05 16:46:48 crc kubenswrapper[4778]: I1205 16:46:48.353342 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerStarted","Data":"3d370072c292ce2d4288c16a79184a396ce7f9ae962287fc80e99b0cf62b3358"} Dec 05 16:46:50 crc kubenswrapper[4778]: I1205 16:46:50.386523 4778 generic.go:334] "Generic (PLEG): 
container finished" podID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" containerID="3d370072c292ce2d4288c16a79184a396ce7f9ae962287fc80e99b0cf62b3358" exitCode=1 Dec 05 16:46:50 crc kubenswrapper[4778]: I1205 16:46:50.386569 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerDied","Data":"3d370072c292ce2d4288c16a79184a396ce7f9ae962287fc80e99b0cf62b3358"} Dec 05 16:46:50 crc kubenswrapper[4778]: I1205 16:46:50.386923 4778 scope.go:117] "RemoveContainer" containerID="d81039bcb3015dfb34076a8bdda4e4bd7a9411abe04d38c2c5b9a1af26d373b9" Dec 05 16:46:50 crc kubenswrapper[4778]: I1205 16:46:50.388953 4778 scope.go:117] "RemoveContainer" containerID="3d370072c292ce2d4288c16a79184a396ce7f9ae962287fc80e99b0cf62b3358" Dec 05 16:46:50 crc kubenswrapper[4778]: E1205 16:46:50.389795 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:46:51 crc kubenswrapper[4778]: I1205 16:46:51.743678 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:51 crc kubenswrapper[4778]: I1205 16:46:51.743736 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:51 crc kubenswrapper[4778]: I1205 16:46:51.743751 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:51 crc kubenswrapper[4778]: I1205 16:46:51.743764 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:46:51 crc kubenswrapper[4778]: I1205 16:46:51.744420 4778 scope.go:117] "RemoveContainer" containerID="3d370072c292ce2d4288c16a79184a396ce7f9ae962287fc80e99b0cf62b3358" Dec 05 16:46:51 crc kubenswrapper[4778]: E1205 16:46:51.744751 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:46:52 crc kubenswrapper[4778]: I1205 16:46:52.249871 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:46:52 crc kubenswrapper[4778]: E1205 16:46:52.250563 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:47:04 crc kubenswrapper[4778]: I1205 16:47:04.249858 4778 scope.go:117] "RemoveContainer" 
containerID="3d370072c292ce2d4288c16a79184a396ce7f9ae962287fc80e99b0cf62b3358" Dec 05 16:47:04 crc kubenswrapper[4778]: I1205 16:47:04.250479 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:47:04 crc kubenswrapper[4778]: E1205 16:47:04.250704 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:47:04 crc kubenswrapper[4778]: E1205 16:47:04.250716 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:47:15 crc kubenswrapper[4778]: I1205 16:47:15.251784 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:47:15 crc kubenswrapper[4778]: E1205 16:47:15.253736 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:47:18 crc kubenswrapper[4778]: I1205 16:47:18.249628 4778 scope.go:117] "RemoveContainer" containerID="3d370072c292ce2d4288c16a79184a396ce7f9ae962287fc80e99b0cf62b3358" Dec 05 16:47:18 crc kubenswrapper[4778]: I1205 16:47:18.623183 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerStarted","Data":"d0070438db29785f755542ccd4e36d811e2fb5c8990d8aa5a5cfc1e08eff3822"} Dec 05 16:47:21 crc kubenswrapper[4778]: I1205 16:47:21.649750 4778 generic.go:334] "Generic (PLEG): container finished" podID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" containerID="d0070438db29785f755542ccd4e36d811e2fb5c8990d8aa5a5cfc1e08eff3822" exitCode=1 Dec 05 16:47:21 crc kubenswrapper[4778]: I1205 16:47:21.649876 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerDied","Data":"d0070438db29785f755542ccd4e36d811e2fb5c8990d8aa5a5cfc1e08eff3822"} Dec 05 16:47:21 crc kubenswrapper[4778]: I1205 16:47:21.650131 4778 scope.go:117] "RemoveContainer" containerID="3d370072c292ce2d4288c16a79184a396ce7f9ae962287fc80e99b0cf62b3358" Dec 05 16:47:21 crc kubenswrapper[4778]: I1205 16:47:21.650887 4778 scope.go:117] "RemoveContainer" containerID="d0070438db29785f755542ccd4e36d811e2fb5c8990d8aa5a5cfc1e08eff3822" Dec 05 16:47:21 crc kubenswrapper[4778]: E1205 16:47:21.651225 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with 
CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:47:21 crc kubenswrapper[4778]: I1205 16:47:21.744346 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:47:21 crc kubenswrapper[4778]: I1205 16:47:21.745268 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:47:21 crc kubenswrapper[4778]: I1205 16:47:21.745433 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:47:21 crc kubenswrapper[4778]: I1205 16:47:21.745529 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:47:22 crc kubenswrapper[4778]: I1205 16:47:22.663141 4778 scope.go:117] "RemoveContainer" containerID="d0070438db29785f755542ccd4e36d811e2fb5c8990d8aa5a5cfc1e08eff3822" Dec 05 16:47:22 crc kubenswrapper[4778]: E1205 16:47:22.663394 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:47:23 crc kubenswrapper[4778]: I1205 16:47:23.670266 4778 scope.go:117] "RemoveContainer" containerID="d0070438db29785f755542ccd4e36d811e2fb5c8990d8aa5a5cfc1e08eff3822" Dec 05 16:47:23 crc kubenswrapper[4778]: E1205 16:47:23.670502 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:47:30 crc kubenswrapper[4778]: I1205 16:47:30.250292 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:47:30 crc kubenswrapper[4778]: E1205 16:47:30.251308 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:47:35 crc kubenswrapper[4778]: I1205 16:47:35.249036 4778 scope.go:117] "RemoveContainer" containerID="d0070438db29785f755542ccd4e36d811e2fb5c8990d8aa5a5cfc1e08eff3822" Dec 05 16:47:35 crc kubenswrapper[4778]: E1205 16:47:35.249786 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine 
pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:47:44 crc kubenswrapper[4778]: I1205 16:47:44.249152 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:47:44 crc kubenswrapper[4778]: E1205 16:47:44.249983 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:47:46 crc kubenswrapper[4778]: I1205 16:47:46.250105 4778 scope.go:117] "RemoveContainer" containerID="d0070438db29785f755542ccd4e36d811e2fb5c8990d8aa5a5cfc1e08eff3822" Dec 05 16:47:46 crc kubenswrapper[4778]: E1205 16:47:46.251122 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:47:49 crc kubenswrapper[4778]: I1205 16:47:49.189748 4778 scope.go:117] "RemoveContainer" containerID="e16b064014592c74a716afbbbe8b0ead37d7ab5ec1cd747f13099642d5ee10ca" Dec 05 16:47:49 crc kubenswrapper[4778]: I1205 16:47:49.218996 4778 scope.go:117] "RemoveContainer" containerID="ee2d541ed224a461dd465fa6b39b67f80bfc149b2cbdd4c5e99241b3d27db58c" Dec 05 16:47:49 crc kubenswrapper[4778]: I1205 16:47:49.260988 4778 scope.go:117] "RemoveContainer" containerID="e33e7bc8c843e95a0c050e337c950e3e6bb8ef545948afbcc7484cb838f3b56b" Dec 05 16:47:49 crc kubenswrapper[4778]: I1205 16:47:49.284357 4778 scope.go:117] "RemoveContainer" containerID="f1c305ae03ed4d88c11880f3e74b788b6cc7a1dabee53c513fd1c913b3b11919" Dec 05 16:47:57 crc kubenswrapper[4778]: I1205 16:47:57.249305 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:47:57 crc kubenswrapper[4778]: E1205 16:47:57.250066 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:48:01 crc kubenswrapper[4778]: I1205 16:48:01.249986 4778 scope.go:117] "RemoveContainer" containerID="d0070438db29785f755542ccd4e36d811e2fb5c8990d8aa5a5cfc1e08eff3822" Dec 05 16:48:01 crc kubenswrapper[4778]: E1205 16:48:01.251044 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:48:11 crc kubenswrapper[4778]: I1205 16:48:11.249605 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:48:11 crc kubenswrapper[4778]: E1205 16:48:11.250783 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:48:16 crc kubenswrapper[4778]: I1205 16:48:16.249094 4778 scope.go:117] "RemoveContainer" containerID="d0070438db29785f755542ccd4e36d811e2fb5c8990d8aa5a5cfc1e08eff3822" Dec 05 16:48:17 crc kubenswrapper[4778]: I1205 16:48:17.118517 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerStarted","Data":"9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640"} Dec 05 16:48:20 crc kubenswrapper[4778]: I1205 16:48:20.144681 4778 generic.go:334] "Generic (PLEG): container finished" podID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" containerID="9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640" exitCode=1 Dec 05 16:48:20 crc kubenswrapper[4778]: I1205 16:48:20.144778 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerDied","Data":"9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640"} Dec 05 16:48:20 crc kubenswrapper[4778]: I1205 16:48:20.144947 4778 scope.go:117] "RemoveContainer" containerID="d0070438db29785f755542ccd4e36d811e2fb5c8990d8aa5a5cfc1e08eff3822" Dec 05 16:48:20 crc kubenswrapper[4778]: I1205 16:48:20.145568 4778 scope.go:117] "RemoveContainer" containerID="9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640" Dec 05 16:48:20 crc kubenswrapper[4778]: E1205 16:48:20.145779 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:48:21 crc kubenswrapper[4778]: I1205 16:48:21.744512 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:48:21 crc kubenswrapper[4778]: I1205 16:48:21.744578 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:48:21 crc kubenswrapper[4778]: I1205 16:48:21.744593 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:48:21 crc kubenswrapper[4778]: I1205 16:48:21.744604 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:48:21 crc kubenswrapper[4778]: I1205 16:48:21.745289 4778 scope.go:117] "RemoveContainer" 
containerID="9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640" Dec 05 16:48:21 crc kubenswrapper[4778]: E1205 16:48:21.745546 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:48:25 crc kubenswrapper[4778]: I1205 16:48:25.249603 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:48:25 crc kubenswrapper[4778]: E1205 16:48:25.250281 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:48:35 crc kubenswrapper[4778]: I1205 16:48:35.249514 4778 scope.go:117] "RemoveContainer" containerID="9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640" Dec 05 16:48:35 crc kubenswrapper[4778]: E1205 16:48:35.250281 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:48:38 crc kubenswrapper[4778]: I1205 16:48:38.249719 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:48:39 crc kubenswrapper[4778]: I1205 16:48:39.282353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"5e27f38bc69e604af8eadcc3f113811915e11ea2ac4830d8c03bf5bb2b0e0f8f"} Dec 05 16:48:46 crc kubenswrapper[4778]: I1205 16:48:46.250424 4778 scope.go:117] "RemoveContainer" containerID="9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640" Dec 05 16:48:46 crc kubenswrapper[4778]: E1205 16:48:46.251541 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:48:57 crc kubenswrapper[4778]: I1205 16:48:57.251710 4778 scope.go:117] "RemoveContainer" containerID="9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640" Dec 05 16:48:57 crc kubenswrapper[4778]: E1205 16:48:57.252579 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine 
pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:49:09 crc kubenswrapper[4778]: I1205 16:49:09.249135 4778 scope.go:117] "RemoveContainer" containerID="9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640" Dec 05 16:49:09 crc kubenswrapper[4778]: E1205 16:49:09.249737 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:49:22 crc kubenswrapper[4778]: I1205 16:49:22.249101 4778 scope.go:117] "RemoveContainer" containerID="9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640" Dec 05 16:49:22 crc kubenswrapper[4778]: E1205 16:49:22.249809 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:49:36 crc kubenswrapper[4778]: I1205 16:49:36.249598 4778 scope.go:117] "RemoveContainer" containerID="9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640" Dec 05 16:49:36 crc kubenswrapper[4778]: E1205 16:49:36.250742 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:49:48 crc kubenswrapper[4778]: I1205 16:49:48.249763 4778 scope.go:117] "RemoveContainer" containerID="9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640" Dec 05 16:49:48 crc kubenswrapper[4778]: I1205 16:49:48.891047 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerStarted","Data":"cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085"} Dec 05 16:49:51 crc kubenswrapper[4778]: I1205 16:49:51.744149 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:49:51 crc kubenswrapper[4778]: I1205 16:49:51.744470 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:49:51 crc kubenswrapper[4778]: E1205 16:49:51.744641 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085 is running failed: container process not found" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" 
cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 16:49:51 crc kubenswrapper[4778]: E1205 16:49:51.745181 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085 is running failed: container process not found" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 16:49:51 crc kubenswrapper[4778]: E1205 16:49:51.745457 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085 is running failed: container process not found" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 16:49:51 crc kubenswrapper[4778]: E1205 16:49:51.745508 4778 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085 is running failed: container process not found" probeType="Startup" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" containerName="watcher-decision-engine" Dec 05 16:49:51 crc kubenswrapper[4778]: I1205 16:49:51.915224 4778 generic.go:334] "Generic (PLEG): container finished" podID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" exitCode=1 Dec 05 16:49:51 crc kubenswrapper[4778]: I1205 16:49:51.915267 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerDied","Data":"cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085"} Dec 05 16:49:51 crc kubenswrapper[4778]: I1205 16:49:51.915298 4778 scope.go:117] "RemoveContainer" containerID="9a575969ff88a6b7ce71aba4a64e22996f22fc7c697fb40b380dc05689e58640" Dec 05 16:49:51 crc kubenswrapper[4778]: I1205 16:49:51.915890 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:49:51 crc kubenswrapper[4778]: E1205 16:49:51.916111 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:50:01 crc kubenswrapper[4778]: I1205 16:50:01.744549 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:50:01 crc kubenswrapper[4778]: I1205 16:50:01.745909 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:50:01 crc kubenswrapper[4778]: E1205 16:50:01.746238 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed 
container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:50:17 crc kubenswrapper[4778]: I1205 16:50:17.250002 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:50:17 crc kubenswrapper[4778]: E1205 16:50:17.251061 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:50:21 crc kubenswrapper[4778]: I1205 16:50:21.743808 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:50:21 crc kubenswrapper[4778]: I1205 16:50:21.745175 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:50:21 crc kubenswrapper[4778]: E1205 16:50:21.745435 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.350007 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8nkhw"] Dec 05 16:50:22 crc kubenswrapper[4778]: E1205 16:50:22.350790 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" containerName="registry-server" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.350832 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" containerName="registry-server" Dec 05 16:50:22 crc kubenswrapper[4778]: E1205 16:50:22.350856 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" containerName="extract-utilities" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.350873 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" containerName="extract-utilities" Dec 05 16:50:22 crc kubenswrapper[4778]: E1205 16:50:22.350925 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" containerName="extract-content" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.350942 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" containerName="extract-content" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.351342 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9ebd54-1db7-407a-a41e-d01d9c6ddf06" containerName="registry-server" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.354164 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.359611 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8nkhw"] Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.420236 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqzff\" (UniqueName: \"kubernetes.io/projected/b29ec1de-909a-4b9b-b03d-6c65ce526923-kube-api-access-qqzff\") pod \"certified-operators-8nkhw\" (UID: \"b29ec1de-909a-4b9b-b03d-6c65ce526923\") " pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.420335 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29ec1de-909a-4b9b-b03d-6c65ce526923-utilities\") pod \"certified-operators-8nkhw\" (UID: \"b29ec1de-909a-4b9b-b03d-6c65ce526923\") " pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.420593 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29ec1de-909a-4b9b-b03d-6c65ce526923-catalog-content\") pod \"certified-operators-8nkhw\" (UID: \"b29ec1de-909a-4b9b-b03d-6c65ce526923\") " pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.521932 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqzff\" (UniqueName: \"kubernetes.io/projected/b29ec1de-909a-4b9b-b03d-6c65ce526923-kube-api-access-qqzff\") pod \"certified-operators-8nkhw\" (UID: \"b29ec1de-909a-4b9b-b03d-6c65ce526923\") " pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.522252 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29ec1de-909a-4b9b-b03d-6c65ce526923-utilities\") pod \"certified-operators-8nkhw\" (UID: \"b29ec1de-909a-4b9b-b03d-6c65ce526923\") " pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.522417 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29ec1de-909a-4b9b-b03d-6c65ce526923-catalog-content\") pod \"certified-operators-8nkhw\" (UID: \"b29ec1de-909a-4b9b-b03d-6c65ce526923\") " pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.522956 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29ec1de-909a-4b9b-b03d-6c65ce526923-utilities\") pod \"certified-operators-8nkhw\" (UID: \"b29ec1de-909a-4b9b-b03d-6c65ce526923\") " pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.523144 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29ec1de-909a-4b9b-b03d-6c65ce526923-catalog-content\") pod \"certified-operators-8nkhw\" (UID: \"b29ec1de-909a-4b9b-b03d-6c65ce526923\") " pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.549282 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qqzff\" (UniqueName: \"kubernetes.io/projected/b29ec1de-909a-4b9b-b03d-6c65ce526923-kube-api-access-qqzff\") pod \"certified-operators-8nkhw\" (UID: \"b29ec1de-909a-4b9b-b03d-6c65ce526923\") " pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:22 crc kubenswrapper[4778]: I1205 16:50:22.692848 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:23 crc kubenswrapper[4778]: I1205 16:50:23.421844 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8nkhw"] Dec 05 16:50:24 crc kubenswrapper[4778]: I1205 16:50:24.220574 4778 generic.go:334] "Generic (PLEG): container finished" podID="b29ec1de-909a-4b9b-b03d-6c65ce526923" containerID="100b790c372c392ccfc2f9987b93e2b01b32cea8ce56d9bdba7cb5a84c956413" exitCode=0 Dec 05 16:50:24 crc kubenswrapper[4778]: I1205 16:50:24.220621 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nkhw" event={"ID":"b29ec1de-909a-4b9b-b03d-6c65ce526923","Type":"ContainerDied","Data":"100b790c372c392ccfc2f9987b93e2b01b32cea8ce56d9bdba7cb5a84c956413"} Dec 05 16:50:24 crc kubenswrapper[4778]: I1205 16:50:24.220649 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nkhw" event={"ID":"b29ec1de-909a-4b9b-b03d-6c65ce526923","Type":"ContainerStarted","Data":"eafe082f3fa436887f429f2d2f4c2d7e53178d0ab27eeaeb2f01a60cc6a9e29b"} Dec 05 16:50:24 crc kubenswrapper[4778]: I1205 16:50:24.222651 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 16:50:25 crc kubenswrapper[4778]: I1205 16:50:25.240020 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nkhw" event={"ID":"b29ec1de-909a-4b9b-b03d-6c65ce526923","Type":"ContainerStarted","Data":"4ba627c42800aa9b6a1b88f7498e73a1b13e91e1a484fe7fe516344de45a09cd"} Dec 05 16:50:26 crc kubenswrapper[4778]: I1205 16:50:26.252285 4778 generic.go:334] "Generic (PLEG): container finished" podID="b29ec1de-909a-4b9b-b03d-6c65ce526923" containerID="4ba627c42800aa9b6a1b88f7498e73a1b13e91e1a484fe7fe516344de45a09cd" exitCode=0 Dec 05 16:50:26 crc kubenswrapper[4778]: I1205 16:50:26.252343 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nkhw" event={"ID":"b29ec1de-909a-4b9b-b03d-6c65ce526923","Type":"ContainerDied","Data":"4ba627c42800aa9b6a1b88f7498e73a1b13e91e1a484fe7fe516344de45a09cd"} Dec 05 16:50:27 crc kubenswrapper[4778]: I1205 16:50:27.262777 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nkhw" event={"ID":"b29ec1de-909a-4b9b-b03d-6c65ce526923","Type":"ContainerStarted","Data":"041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa"} Dec 05 16:50:27 crc kubenswrapper[4778]: I1205 16:50:27.284127 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8nkhw" podStartSLOduration=2.613191869 podStartE2EDuration="5.284107724s" podCreationTimestamp="2025-12-05 16:50:22 +0000 UTC" firstStartedPulling="2025-12-05 16:50:24.222356645 +0000 UTC m=+3311.326153045" lastFinishedPulling="2025-12-05 16:50:26.89327252 +0000 UTC m=+3313.997068900" observedRunningTime="2025-12-05 16:50:27.279398365 +0000 UTC m=+3314.383194745" watchObservedRunningTime="2025-12-05 
16:50:27.284107724 +0000 UTC m=+3314.387904104" Dec 05 16:50:32 crc kubenswrapper[4778]: I1205 16:50:32.693016 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:32 crc kubenswrapper[4778]: I1205 16:50:32.693552 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:32 crc kubenswrapper[4778]: I1205 16:50:32.746245 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:33 crc kubenswrapper[4778]: I1205 16:50:33.373819 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:36 crc kubenswrapper[4778]: I1205 16:50:36.340543 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8nkhw"] Dec 05 16:50:36 crc kubenswrapper[4778]: I1205 16:50:36.341185 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8nkhw" podUID="b29ec1de-909a-4b9b-b03d-6c65ce526923" containerName="registry-server" containerID="cri-o://041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa" gracePeriod=2 Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.248959 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:50:37 crc kubenswrapper[4778]: E1205 16:50:37.249427 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.312066 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.370350 4778 generic.go:334] "Generic (PLEG): container finished" podID="b29ec1de-909a-4b9b-b03d-6c65ce526923" containerID="041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa" exitCode=0 Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.370418 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nkhw" event={"ID":"b29ec1de-909a-4b9b-b03d-6c65ce526923","Type":"ContainerDied","Data":"041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa"} Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.370423 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8nkhw" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.370441 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nkhw" event={"ID":"b29ec1de-909a-4b9b-b03d-6c65ce526923","Type":"ContainerDied","Data":"eafe082f3fa436887f429f2d2f4c2d7e53178d0ab27eeaeb2f01a60cc6a9e29b"} Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.370457 4778 scope.go:117] "RemoveContainer" containerID="041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.392624 4778 scope.go:117] "RemoveContainer" containerID="4ba627c42800aa9b6a1b88f7498e73a1b13e91e1a484fe7fe516344de45a09cd" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.410197 4778 scope.go:117] "RemoveContainer" containerID="100b790c372c392ccfc2f9987b93e2b01b32cea8ce56d9bdba7cb5a84c956413" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.446155 4778 scope.go:117] "RemoveContainer" containerID="041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa" Dec 05 16:50:37 crc kubenswrapper[4778]: E1205 16:50:37.446609 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa\": container with ID starting with 041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa not found: ID does not exist" containerID="041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.446649 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa"} err="failed to get container status \"041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa\": rpc error: code = NotFound desc = could not find container \"041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa\": container with ID starting with 041acfc32000edcc0a4fab2ec3d0a129399ccf93718af7daa5bc67a13a71ecfa not found: ID does not exist" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.446676 4778 scope.go:117] "RemoveContainer" containerID="4ba627c42800aa9b6a1b88f7498e73a1b13e91e1a484fe7fe516344de45a09cd" Dec 05 16:50:37 crc kubenswrapper[4778]: E1205 16:50:37.446941 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba627c42800aa9b6a1b88f7498e73a1b13e91e1a484fe7fe516344de45a09cd\": container with ID starting with 4ba627c42800aa9b6a1b88f7498e73a1b13e91e1a484fe7fe516344de45a09cd not found: ID does not exist" containerID="4ba627c42800aa9b6a1b88f7498e73a1b13e91e1a484fe7fe516344de45a09cd" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.446967 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba627c42800aa9b6a1b88f7498e73a1b13e91e1a484fe7fe516344de45a09cd"} err="failed to get container status \"4ba627c42800aa9b6a1b88f7498e73a1b13e91e1a484fe7fe516344de45a09cd\": rpc error: code = NotFound desc = could not find container \"4ba627c42800aa9b6a1b88f7498e73a1b13e91e1a484fe7fe516344de45a09cd\": container with ID starting with 4ba627c42800aa9b6a1b88f7498e73a1b13e91e1a484fe7fe516344de45a09cd not found: ID does not exist" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.446985 4778 scope.go:117] "RemoveContainer" 
containerID="100b790c372c392ccfc2f9987b93e2b01b32cea8ce56d9bdba7cb5a84c956413" Dec 05 16:50:37 crc kubenswrapper[4778]: E1205 16:50:37.447359 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"100b790c372c392ccfc2f9987b93e2b01b32cea8ce56d9bdba7cb5a84c956413\": container with ID starting with 100b790c372c392ccfc2f9987b93e2b01b32cea8ce56d9bdba7cb5a84c956413 not found: ID does not exist" containerID="100b790c372c392ccfc2f9987b93e2b01b32cea8ce56d9bdba7cb5a84c956413" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.447405 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"100b790c372c392ccfc2f9987b93e2b01b32cea8ce56d9bdba7cb5a84c956413"} err="failed to get container status \"100b790c372c392ccfc2f9987b93e2b01b32cea8ce56d9bdba7cb5a84c956413\": rpc error: code = NotFound desc = could not find container \"100b790c372c392ccfc2f9987b93e2b01b32cea8ce56d9bdba7cb5a84c956413\": container with ID starting with 100b790c372c392ccfc2f9987b93e2b01b32cea8ce56d9bdba7cb5a84c956413 not found: ID does not exist" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.486839 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqzff\" (UniqueName: \"kubernetes.io/projected/b29ec1de-909a-4b9b-b03d-6c65ce526923-kube-api-access-qqzff\") pod \"b29ec1de-909a-4b9b-b03d-6c65ce526923\" (UID: \"b29ec1de-909a-4b9b-b03d-6c65ce526923\") " Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.486947 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29ec1de-909a-4b9b-b03d-6c65ce526923-utilities\") pod \"b29ec1de-909a-4b9b-b03d-6c65ce526923\" (UID: \"b29ec1de-909a-4b9b-b03d-6c65ce526923\") " Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.487073 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29ec1de-909a-4b9b-b03d-6c65ce526923-catalog-content\") pod \"b29ec1de-909a-4b9b-b03d-6c65ce526923\" (UID: \"b29ec1de-909a-4b9b-b03d-6c65ce526923\") " Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.488208 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29ec1de-909a-4b9b-b03d-6c65ce526923-utilities" (OuterVolumeSpecName: "utilities") pod "b29ec1de-909a-4b9b-b03d-6c65ce526923" (UID: "b29ec1de-909a-4b9b-b03d-6c65ce526923"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.492885 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b29ec1de-909a-4b9b-b03d-6c65ce526923-kube-api-access-qqzff" (OuterVolumeSpecName: "kube-api-access-qqzff") pod "b29ec1de-909a-4b9b-b03d-6c65ce526923" (UID: "b29ec1de-909a-4b9b-b03d-6c65ce526923"). InnerVolumeSpecName "kube-api-access-qqzff". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.535738 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29ec1de-909a-4b9b-b03d-6c65ce526923-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b29ec1de-909a-4b9b-b03d-6c65ce526923" (UID: "b29ec1de-909a-4b9b-b03d-6c65ce526923"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.589469 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29ec1de-909a-4b9b-b03d-6c65ce526923-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.589833 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqzff\" (UniqueName: \"kubernetes.io/projected/b29ec1de-909a-4b9b-b03d-6c65ce526923-kube-api-access-qqzff\") on node \"crc\" DevicePath \"\"" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.589854 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29ec1de-909a-4b9b-b03d-6c65ce526923-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.703093 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8nkhw"] Dec 05 16:50:37 crc kubenswrapper[4778]: I1205 16:50:37.710577 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8nkhw"] Dec 05 16:50:39 crc kubenswrapper[4778]: I1205 16:50:39.268710 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b29ec1de-909a-4b9b-b03d-6c65ce526923" path="/var/lib/kubelet/pods/b29ec1de-909a-4b9b-b03d-6c65ce526923/volumes" Dec 05 16:50:52 crc kubenswrapper[4778]: I1205 16:50:52.249494 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:50:52 crc kubenswrapper[4778]: E1205 16:50:52.250164 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.544893 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-99v6r"] Dec 05 16:50:59 crc kubenswrapper[4778]: E1205 16:50:59.545654 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29ec1de-909a-4b9b-b03d-6c65ce526923" containerName="extract-utilities" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.545666 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29ec1de-909a-4b9b-b03d-6c65ce526923" containerName="extract-utilities" Dec 05 16:50:59 crc kubenswrapper[4778]: E1205 16:50:59.545675 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29ec1de-909a-4b9b-b03d-6c65ce526923" containerName="extract-content" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.545681 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29ec1de-909a-4b9b-b03d-6c65ce526923" containerName="extract-content" Dec 05 16:50:59 crc kubenswrapper[4778]: E1205 16:50:59.545698 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29ec1de-909a-4b9b-b03d-6c65ce526923" containerName="registry-server" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.545705 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29ec1de-909a-4b9b-b03d-6c65ce526923" containerName="registry-server" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.545855 4778 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b29ec1de-909a-4b9b-b03d-6c65ce526923" containerName="registry-server" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.546911 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.569781 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-99v6r"] Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.591510 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-utilities\") pod \"redhat-operators-99v6r\" (UID: \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\") " pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.591636 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s72kb\" (UniqueName: \"kubernetes.io/projected/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-kube-api-access-s72kb\") pod \"redhat-operators-99v6r\" (UID: \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\") " pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.591712 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-catalog-content\") pod \"redhat-operators-99v6r\" (UID: \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\") " pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.692820 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-catalog-content\") pod \"redhat-operators-99v6r\" (UID: \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\") " pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.692964 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-utilities\") pod \"redhat-operators-99v6r\" (UID: \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\") " pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.693030 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s72kb\" (UniqueName: \"kubernetes.io/projected/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-kube-api-access-s72kb\") pod \"redhat-operators-99v6r\" (UID: \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\") " pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.693326 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-catalog-content\") pod \"redhat-operators-99v6r\" (UID: \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\") " pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.693382 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-utilities\") pod \"redhat-operators-99v6r\" 
(UID: \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\") " pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.716443 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s72kb\" (UniqueName: \"kubernetes.io/projected/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-kube-api-access-s72kb\") pod \"redhat-operators-99v6r\" (UID: \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\") " pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:50:59 crc kubenswrapper[4778]: I1205 16:50:59.876148 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:51:00 crc kubenswrapper[4778]: I1205 16:51:00.332609 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-99v6r"] Dec 05 16:51:00 crc kubenswrapper[4778]: I1205 16:51:00.571720 4778 generic.go:334] "Generic (PLEG): container finished" podID="8e7d3ada-ada1-4f1b-a877-7e0b4747485f" containerID="c01555ca8d0cf32ac7d8fb602ebeac501e4f3fcd6febd1af6ddae538853f49fe" exitCode=0 Dec 05 16:51:00 crc kubenswrapper[4778]: I1205 16:51:00.571776 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99v6r" event={"ID":"8e7d3ada-ada1-4f1b-a877-7e0b4747485f","Type":"ContainerDied","Data":"c01555ca8d0cf32ac7d8fb602ebeac501e4f3fcd6febd1af6ddae538853f49fe"} Dec 05 16:51:00 crc kubenswrapper[4778]: I1205 16:51:00.571808 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99v6r" event={"ID":"8e7d3ada-ada1-4f1b-a877-7e0b4747485f","Type":"ContainerStarted","Data":"92fb85cfc3823a97103133b72cf15813276901a00e49bde156cdfc23fa5179c6"} Dec 05 16:51:01 crc kubenswrapper[4778]: I1205 16:51:01.580824 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99v6r" event={"ID":"8e7d3ada-ada1-4f1b-a877-7e0b4747485f","Type":"ContainerStarted","Data":"f3de8e94d6c7ed614fd67eec45a7121c08bb7ae04da71b14f6201c3450043c94"} Dec 05 16:51:02 crc kubenswrapper[4778]: I1205 16:51:02.589853 4778 generic.go:334] "Generic (PLEG): container finished" podID="8e7d3ada-ada1-4f1b-a877-7e0b4747485f" containerID="f3de8e94d6c7ed614fd67eec45a7121c08bb7ae04da71b14f6201c3450043c94" exitCode=0 Dec 05 16:51:02 crc kubenswrapper[4778]: I1205 16:51:02.589908 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99v6r" event={"ID":"8e7d3ada-ada1-4f1b-a877-7e0b4747485f","Type":"ContainerDied","Data":"f3de8e94d6c7ed614fd67eec45a7121c08bb7ae04da71b14f6201c3450043c94"} Dec 05 16:51:03 crc kubenswrapper[4778]: I1205 16:51:03.414582 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:51:03 crc kubenswrapper[4778]: I1205 16:51:03.414665 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:51:03 crc kubenswrapper[4778]: I1205 16:51:03.600166 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-99v6r" event={"ID":"8e7d3ada-ada1-4f1b-a877-7e0b4747485f","Type":"ContainerStarted","Data":"fa6f79a1f835c2a99b004cc278b728fe122a756facf3ccb54d01ec0affe5dac0"} Dec 05 16:51:03 crc kubenswrapper[4778]: I1205 16:51:03.621099 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-99v6r" podStartSLOduration=1.99851344 podStartE2EDuration="4.621078786s" podCreationTimestamp="2025-12-05 16:50:59 +0000 UTC" firstStartedPulling="2025-12-05 16:51:00.57375834 +0000 UTC m=+3347.677554730" lastFinishedPulling="2025-12-05 16:51:03.196323696 +0000 UTC m=+3350.300120076" observedRunningTime="2025-12-05 16:51:03.617510389 +0000 UTC m=+3350.721306779" watchObservedRunningTime="2025-12-05 16:51:03.621078786 +0000 UTC m=+3350.724875176" Dec 05 16:51:06 crc kubenswrapper[4778]: I1205 16:51:06.249760 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:51:06 crc kubenswrapper[4778]: E1205 16:51:06.250291 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:51:09 crc kubenswrapper[4778]: I1205 16:51:09.876434 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:51:09 crc kubenswrapper[4778]: I1205 16:51:09.877280 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:51:09 crc kubenswrapper[4778]: I1205 16:51:09.933140 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:51:10 crc kubenswrapper[4778]: I1205 16:51:10.744094 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:51:13 crc kubenswrapper[4778]: I1205 16:51:13.541381 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-99v6r"] Dec 05 16:51:13 crc kubenswrapper[4778]: I1205 16:51:13.704477 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-99v6r" podUID="8e7d3ada-ada1-4f1b-a877-7e0b4747485f" containerName="registry-server" containerID="cri-o://fa6f79a1f835c2a99b004cc278b728fe122a756facf3ccb54d01ec0affe5dac0" gracePeriod=2 Dec 05 16:51:15 crc kubenswrapper[4778]: I1205 16:51:15.738270 4778 generic.go:334] "Generic (PLEG): container finished" podID="8e7d3ada-ada1-4f1b-a877-7e0b4747485f" containerID="fa6f79a1f835c2a99b004cc278b728fe122a756facf3ccb54d01ec0affe5dac0" exitCode=0 Dec 05 16:51:15 crc kubenswrapper[4778]: I1205 16:51:15.738339 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99v6r" event={"ID":"8e7d3ada-ada1-4f1b-a877-7e0b4747485f","Type":"ContainerDied","Data":"fa6f79a1f835c2a99b004cc278b728fe122a756facf3ccb54d01ec0affe5dac0"} Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.011689 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.165160 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-utilities\") pod \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\" (UID: \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\") " Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.165282 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-catalog-content\") pod \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\" (UID: \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\") " Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.165313 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s72kb\" (UniqueName: \"kubernetes.io/projected/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-kube-api-access-s72kb\") pod \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\" (UID: \"8e7d3ada-ada1-4f1b-a877-7e0b4747485f\") " Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.166215 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-utilities" (OuterVolumeSpecName: "utilities") pod "8e7d3ada-ada1-4f1b-a877-7e0b4747485f" (UID: "8e7d3ada-ada1-4f1b-a877-7e0b4747485f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.171010 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-kube-api-access-s72kb" (OuterVolumeSpecName: "kube-api-access-s72kb") pod "8e7d3ada-ada1-4f1b-a877-7e0b4747485f" (UID: "8e7d3ada-ada1-4f1b-a877-7e0b4747485f"). InnerVolumeSpecName "kube-api-access-s72kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.267598 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.267634 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s72kb\" (UniqueName: \"kubernetes.io/projected/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-kube-api-access-s72kb\") on node \"crc\" DevicePath \"\"" Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.307603 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e7d3ada-ada1-4f1b-a877-7e0b4747485f" (UID: "8e7d3ada-ada1-4f1b-a877-7e0b4747485f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.369034 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3ada-ada1-4f1b-a877-7e0b4747485f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.748061 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99v6r" event={"ID":"8e7d3ada-ada1-4f1b-a877-7e0b4747485f","Type":"ContainerDied","Data":"92fb85cfc3823a97103133b72cf15813276901a00e49bde156cdfc23fa5179c6"} Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.749064 4778 scope.go:117] "RemoveContainer" containerID="fa6f79a1f835c2a99b004cc278b728fe122a756facf3ccb54d01ec0affe5dac0" Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.748124 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-99v6r" Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.768269 4778 scope.go:117] "RemoveContainer" containerID="f3de8e94d6c7ed614fd67eec45a7121c08bb7ae04da71b14f6201c3450043c94" Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.783087 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-99v6r"] Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.791667 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-99v6r"] Dec 05 16:51:16 crc kubenswrapper[4778]: I1205 16:51:16.801325 4778 scope.go:117] "RemoveContainer" containerID="c01555ca8d0cf32ac7d8fb602ebeac501e4f3fcd6febd1af6ddae538853f49fe" Dec 05 16:51:17 crc kubenswrapper[4778]: I1205 16:51:17.249586 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:51:17 crc kubenswrapper[4778]: E1205 16:51:17.250092 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:51:17 crc kubenswrapper[4778]: I1205 16:51:17.258001 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e7d3ada-ada1-4f1b-a877-7e0b4747485f" path="/var/lib/kubelet/pods/8e7d3ada-ada1-4f1b-a877-7e0b4747485f/volumes" Dec 05 16:51:30 crc kubenswrapper[4778]: I1205 16:51:30.249625 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:51:30 crc kubenswrapper[4778]: E1205 16:51:30.250337 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:51:33 crc kubenswrapper[4778]: I1205 16:51:33.414762 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:51:33 crc kubenswrapper[4778]: I1205 16:51:33.416624 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.186396 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5wx5k/must-gather-w578h"] Dec 05 16:51:35 crc kubenswrapper[4778]: E1205 16:51:35.186987 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7d3ada-ada1-4f1b-a877-7e0b4747485f" containerName="extract-content" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.187000 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7d3ada-ada1-4f1b-a877-7e0b4747485f" containerName="extract-content" Dec 05 16:51:35 crc kubenswrapper[4778]: E1205 16:51:35.187014 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7d3ada-ada1-4f1b-a877-7e0b4747485f" containerName="registry-server" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.187022 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7d3ada-ada1-4f1b-a877-7e0b4747485f" containerName="registry-server" Dec 05 16:51:35 crc kubenswrapper[4778]: E1205 16:51:35.187046 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7d3ada-ada1-4f1b-a877-7e0b4747485f" containerName="extract-utilities" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.187053 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7d3ada-ada1-4f1b-a877-7e0b4747485f" containerName="extract-utilities" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.188768 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7d3ada-ada1-4f1b-a877-7e0b4747485f" containerName="registry-server" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.189736 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wx5k/must-gather-w578h" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.193186 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5wx5k"/"kube-root-ca.crt" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.193258 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5wx5k"/"openshift-service-ca.crt" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.205877 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5wx5k/must-gather-w578h"] Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.387462 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3-must-gather-output\") pod \"must-gather-w578h\" (UID: \"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3\") " pod="openshift-must-gather-5wx5k/must-gather-w578h" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.387504 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cdfg\" (UniqueName: \"kubernetes.io/projected/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3-kube-api-access-7cdfg\") pod \"must-gather-w578h\" (UID: \"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3\") " pod="openshift-must-gather-5wx5k/must-gather-w578h" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.488703 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3-must-gather-output\") pod \"must-gather-w578h\" (UID: \"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3\") " pod="openshift-must-gather-5wx5k/must-gather-w578h" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.489198 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3-must-gather-output\") pod \"must-gather-w578h\" (UID: \"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3\") " pod="openshift-must-gather-5wx5k/must-gather-w578h" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.488749 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cdfg\" (UniqueName: \"kubernetes.io/projected/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3-kube-api-access-7cdfg\") pod \"must-gather-w578h\" (UID: \"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3\") " pod="openshift-must-gather-5wx5k/must-gather-w578h" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.526564 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cdfg\" (UniqueName: \"kubernetes.io/projected/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3-kube-api-access-7cdfg\") pod \"must-gather-w578h\" (UID: \"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3\") " pod="openshift-must-gather-5wx5k/must-gather-w578h" Dec 05 16:51:35 crc kubenswrapper[4778]: I1205 16:51:35.824976 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wx5k/must-gather-w578h" Dec 05 16:51:36 crc kubenswrapper[4778]: I1205 16:51:36.145260 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5wx5k/must-gather-w578h"] Dec 05 16:51:36 crc kubenswrapper[4778]: I1205 16:51:36.948174 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wx5k/must-gather-w578h" event={"ID":"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3","Type":"ContainerStarted","Data":"9dd94cacb0072b81e4e8830ca0a2f9462fb578598622166c97aabb8e6736216e"} Dec 05 16:51:40 crc kubenswrapper[4778]: I1205 16:51:40.983877 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wx5k/must-gather-w578h" event={"ID":"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3","Type":"ContainerStarted","Data":"be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2"} Dec 05 16:51:41 crc kubenswrapper[4778]: I1205 16:51:41.992513 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wx5k/must-gather-w578h" event={"ID":"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3","Type":"ContainerStarted","Data":"4393d61bb90671a6004faddece9e64fbe19b40e8fecb97f67e6846a13c8a6fc7"} Dec 05 16:51:42 crc kubenswrapper[4778]: I1205 16:51:42.025226 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5wx5k/must-gather-w578h" podStartSLOduration=2.633006059 podStartE2EDuration="7.025201849s" podCreationTimestamp="2025-12-05 16:51:35 +0000 UTC" firstStartedPulling="2025-12-05 16:51:36.154522389 +0000 UTC m=+3383.258318769" lastFinishedPulling="2025-12-05 16:51:40.546718179 +0000 UTC m=+3387.650514559" observedRunningTime="2025-12-05 16:51:42.019085742 +0000 UTC m=+3389.122882122" watchObservedRunningTime="2025-12-05 16:51:42.025201849 +0000 UTC m=+3389.128998239" Dec 05 16:51:43 crc kubenswrapper[4778]: I1205 16:51:43.256453 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:51:43 crc kubenswrapper[4778]: E1205 16:51:43.256726 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:51:56 crc kubenswrapper[4778]: I1205 16:51:56.250075 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:51:56 crc kubenswrapper[4778]: E1205 16:51:56.250814 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:52:03 crc kubenswrapper[4778]: I1205 16:52:03.414471 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:52:03 crc 
kubenswrapper[4778]: I1205 16:52:03.414946 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:52:03 crc kubenswrapper[4778]: I1205 16:52:03.415033 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 16:52:03 crc kubenswrapper[4778]: I1205 16:52:03.415986 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e27f38bc69e604af8eadcc3f113811915e11ea2ac4830d8c03bf5bb2b0e0f8f"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:52:03 crc kubenswrapper[4778]: I1205 16:52:03.416056 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://5e27f38bc69e604af8eadcc3f113811915e11ea2ac4830d8c03bf5bb2b0e0f8f" gracePeriod=600 Dec 05 16:52:04 crc kubenswrapper[4778]: I1205 16:52:04.148645 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="5e27f38bc69e604af8eadcc3f113811915e11ea2ac4830d8c03bf5bb2b0e0f8f" exitCode=0 Dec 05 16:52:04 crc kubenswrapper[4778]: I1205 16:52:04.148717 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"5e27f38bc69e604af8eadcc3f113811915e11ea2ac4830d8c03bf5bb2b0e0f8f"} Dec 05 16:52:04 crc kubenswrapper[4778]: I1205 16:52:04.148916 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450"} Dec 05 16:52:04 crc kubenswrapper[4778]: I1205 16:52:04.148940 4778 scope.go:117] "RemoveContainer" containerID="3604c564f14d1ae160d7f88daf8e98d0cf2b3599614ded0e08ee7e24d4ef1dd5" Dec 05 16:52:08 crc kubenswrapper[4778]: I1205 16:52:08.249349 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:52:08 crc kubenswrapper[4778]: E1205 16:52:08.258694 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:52:20 crc kubenswrapper[4778]: I1205 16:52:20.250004 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:52:20 crc kubenswrapper[4778]: E1205 16:52:20.250861 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with 
CrashLoopBackOff: \"back-off 2m40s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.154534 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x79b8"] Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.157042 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.168813 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x79b8"] Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.250149 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.299195 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6465f\" (UniqueName: \"kubernetes.io/projected/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-kube-api-access-6465f\") pod \"community-operators-x79b8\" (UID: \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\") " pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.299252 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-catalog-content\") pod \"community-operators-x79b8\" (UID: \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\") " pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.299384 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-utilities\") pod \"community-operators-x79b8\" (UID: \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\") " pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.401298 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-utilities\") pod \"community-operators-x79b8\" (UID: \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\") " pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.401707 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6465f\" (UniqueName: \"kubernetes.io/projected/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-kube-api-access-6465f\") pod \"community-operators-x79b8\" (UID: \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\") " pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.401753 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-catalog-content\") pod \"community-operators-x79b8\" (UID: \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\") " pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.402197 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-utilities\") pod \"community-operators-x79b8\" (UID: \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\") " pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.403229 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-catalog-content\") pod \"community-operators-x79b8\" (UID: \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\") " pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.428140 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6465f\" (UniqueName: \"kubernetes.io/projected/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-kube-api-access-6465f\") pod \"community-operators-x79b8\" (UID: \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\") " pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:34 crc kubenswrapper[4778]: I1205 16:52:34.478633 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:35 crc kubenswrapper[4778]: I1205 16:52:35.023706 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x79b8"] Dec 05 16:52:35 crc kubenswrapper[4778]: W1205 16:52:35.026081 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda75fb65a_6b82_4afe_9c4f_264b0a4fc4c0.slice/crio-91fe008d69c1e762229a655bcb1d858aa32a519502081e9970d1cad945523bac WatchSource:0}: Error finding container 91fe008d69c1e762229a655bcb1d858aa32a519502081e9970d1cad945523bac: Status 404 returned error can't find the container with id 91fe008d69c1e762229a655bcb1d858aa32a519502081e9970d1cad945523bac Dec 05 16:52:35 crc kubenswrapper[4778]: I1205 16:52:35.383111 4778 generic.go:334] "Generic (PLEG): container finished" podID="a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" containerID="3390110f6f508f3f2d9a45c814fe3d10854f2eeccc5a2c4d21a6c3d3c0435701" exitCode=0 Dec 05 16:52:35 crc kubenswrapper[4778]: I1205 16:52:35.383670 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x79b8" event={"ID":"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0","Type":"ContainerDied","Data":"3390110f6f508f3f2d9a45c814fe3d10854f2eeccc5a2c4d21a6c3d3c0435701"} Dec 05 16:52:35 crc kubenswrapper[4778]: I1205 16:52:35.383711 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x79b8" event={"ID":"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0","Type":"ContainerStarted","Data":"91fe008d69c1e762229a655bcb1d858aa32a519502081e9970d1cad945523bac"} Dec 05 16:52:35 crc kubenswrapper[4778]: I1205 16:52:35.388936 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerStarted","Data":"0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778"} Dec 05 16:52:36 crc kubenswrapper[4778]: I1205 16:52:36.400752 4778 generic.go:334] "Generic (PLEG): container finished" podID="a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" containerID="a04a034f03fbbcb6f3d5ed51f0bcc1e0ce20aa585931074ac54c76dc3e0414b4" exitCode=0 Dec 05 16:52:36 crc kubenswrapper[4778]: I1205 16:52:36.400811 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x79b8" event={"ID":"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0","Type":"ContainerDied","Data":"a04a034f03fbbcb6f3d5ed51f0bcc1e0ce20aa585931074ac54c76dc3e0414b4"} Dec 05 16:52:37 crc kubenswrapper[4778]: I1205 16:52:37.411285 4778 generic.go:334] "Generic (PLEG): container finished" podID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" exitCode=1 Dec 05 16:52:37 crc kubenswrapper[4778]: I1205 16:52:37.411354 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerDied","Data":"0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778"} Dec 05 16:52:37 crc kubenswrapper[4778]: I1205 16:52:37.411672 4778 scope.go:117] "RemoveContainer" containerID="cff14accfb0d280767b57c0951d39ca5b39853da954bb29729bb6cd923a6c085" Dec 05 16:52:37 crc kubenswrapper[4778]: I1205 16:52:37.412278 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:52:37 crc kubenswrapper[4778]: E1205 16:52:37.412524 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:52:37 crc kubenswrapper[4778]: I1205 16:52:37.415313 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x79b8" event={"ID":"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0","Type":"ContainerStarted","Data":"09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190"} Dec 05 16:52:37 crc kubenswrapper[4778]: I1205 16:52:37.446713 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x79b8" podStartSLOduration=2.023650026 podStartE2EDuration="3.446692035s" podCreationTimestamp="2025-12-05 16:52:34 +0000 UTC" firstStartedPulling="2025-12-05 16:52:35.386162926 +0000 UTC m=+3442.489959306" lastFinishedPulling="2025-12-05 16:52:36.809204935 +0000 UTC m=+3443.913001315" observedRunningTime="2025-12-05 16:52:37.445232795 +0000 UTC m=+3444.549029175" watchObservedRunningTime="2025-12-05 16:52:37.446692035 +0000 UTC m=+3444.550488435" Dec 05 16:52:41 crc kubenswrapper[4778]: I1205 16:52:41.744836 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:52:41 crc kubenswrapper[4778]: I1205 16:52:41.745285 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:52:41 crc kubenswrapper[4778]: I1205 16:52:41.745723 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:52:41 crc kubenswrapper[4778]: E1205 16:52:41.745903 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine 
pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:52:44 crc kubenswrapper[4778]: I1205 16:52:44.479639 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:44 crc kubenswrapper[4778]: I1205 16:52:44.480730 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:44 crc kubenswrapper[4778]: I1205 16:52:44.523851 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt_c5af7422-412e-468f-8b0c-dee56152cbfd/util/0.log" Dec 05 16:52:44 crc kubenswrapper[4778]: I1205 16:52:44.531199 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:44 crc kubenswrapper[4778]: I1205 16:52:44.721235 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt_c5af7422-412e-468f-8b0c-dee56152cbfd/util/0.log" Dec 05 16:52:44 crc kubenswrapper[4778]: I1205 16:52:44.789331 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt_c5af7422-412e-468f-8b0c-dee56152cbfd/pull/0.log" Dec 05 16:52:44 crc kubenswrapper[4778]: I1205 16:52:44.806713 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt_c5af7422-412e-468f-8b0c-dee56152cbfd/pull/0.log" Dec 05 16:52:44 crc kubenswrapper[4778]: I1205 16:52:44.973903 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt_c5af7422-412e-468f-8b0c-dee56152cbfd/util/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.082097 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt_c5af7422-412e-468f-8b0c-dee56152cbfd/pull/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.103459 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_43d3879e62b1a786f2544f7bdd094d49d1a004d9556d4afb8c64e444964bbkt_c5af7422-412e-468f-8b0c-dee56152cbfd/extract/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.209609 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd_6a9f95dc-00ca-4792-9fe4-3b25e68b80fd/util/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.360848 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd_6a9f95dc-00ca-4792-9fe4-3b25e68b80fd/pull/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.390130 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd_6a9f95dc-00ca-4792-9fe4-3b25e68b80fd/util/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.413447 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd_6a9f95dc-00ca-4792-9fe4-3b25e68b80fd/pull/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.526875 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.605878 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd_6a9f95dc-00ca-4792-9fe4-3b25e68b80fd/extract/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.615817 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd_6a9f95dc-00ca-4792-9fe4-3b25e68b80fd/util/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.644202 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b04e22b5302ca5d8c540e2294979eca7e76b3503e736a2e22abe550bff8sjcd_6a9f95dc-00ca-4792-9fe4-3b25e68b80fd/pull/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.772267 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-krhmn_eb7ca6ca-0075-46eb-9a5c-e445d06c3425/kube-rbac-proxy/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.833998 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-krhmn_eb7ca6ca-0075-46eb-9a5c-e445d06c3425/manager/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.884939 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-l8jdl_1e0f0d87-e234-4800-847d-694de5f7dd68/kube-rbac-proxy/0.log" Dec 05 16:52:45 crc kubenswrapper[4778]: I1205 16:52:45.968719 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-l8jdl_1e0f0d87-e234-4800-847d-694de5f7dd68/manager/0.log" Dec 05 16:52:46 crc kubenswrapper[4778]: I1205 16:52:46.050108 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-7cqh2_5f6baea9-4909-4365-8091-d1d4acba26bd/kube-rbac-proxy/0.log" Dec 05 16:52:46 crc kubenswrapper[4778]: I1205 16:52:46.072350 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-7cqh2_5f6baea9-4909-4365-8091-d1d4acba26bd/manager/0.log" Dec 05 16:52:46 crc kubenswrapper[4778]: I1205 16:52:46.226623 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-kwnlh_a1f52513-b5c6-45ac-9cf7-42e04ba8b114/kube-rbac-proxy/0.log" Dec 05 16:52:46 crc kubenswrapper[4778]: I1205 16:52:46.232107 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-kwnlh_a1f52513-b5c6-45ac-9cf7-42e04ba8b114/manager/0.log" Dec 05 16:52:46 crc kubenswrapper[4778]: I1205 16:52:46.420683 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-tqqnh_219350c9-1342-44bb-82d0-6a121ebb354b/kube-rbac-proxy/0.log" Dec 05 16:52:46 crc kubenswrapper[4778]: I1205 16:52:46.432805 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-tqqnh_219350c9-1342-44bb-82d0-6a121ebb354b/manager/0.log" Dec 05 16:52:46 crc kubenswrapper[4778]: I1205 16:52:46.566974 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-hgg7b_fe21af78-21d5-440b-977a-1accce9c5ed3/kube-rbac-proxy/0.log" Dec 05 16:52:46 crc kubenswrapper[4778]: I1205 16:52:46.651048 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-hgg7b_fe21af78-21d5-440b-977a-1accce9c5ed3/manager/0.log" Dec 05 16:52:46 crc kubenswrapper[4778]: I1205 16:52:46.701205 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-8htgg_95b0b5ac-33e1-430d-a501-f429b6ccb4fe/kube-rbac-proxy/0.log" Dec 05 16:52:46 crc kubenswrapper[4778]: I1205 16:52:46.810428 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-8htgg_95b0b5ac-33e1-430d-a501-f429b6ccb4fe/manager/0.log" Dec 05 16:52:47 crc kubenswrapper[4778]: I1205 16:52:47.043480 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-lqlll_c0add30f-d439-45c8-93e7-793a49ef95dc/kube-rbac-proxy/0.log" Dec 05 16:52:47 crc kubenswrapper[4778]: I1205 16:52:47.087850 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-lqlll_c0add30f-d439-45c8-93e7-793a49ef95dc/manager/0.log" Dec 05 16:52:47 crc kubenswrapper[4778]: I1205 16:52:47.222058 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-59jpj_3ba2c006-d33f-4179-8f24-73dcd6231085/kube-rbac-proxy/0.log" Dec 05 16:52:47 crc kubenswrapper[4778]: I1205 16:52:47.323589 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-59jpj_3ba2c006-d33f-4179-8f24-73dcd6231085/manager/0.log" Dec 05 16:52:47 crc kubenswrapper[4778]: I1205 16:52:47.412893 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-xhg2w_b52d37d6-575a-4fa3-96af-4d72413e41e3/kube-rbac-proxy/0.log" Dec 05 16:52:47 crc kubenswrapper[4778]: I1205 16:52:47.440736 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-xhg2w_b52d37d6-575a-4fa3-96af-4d72413e41e3/manager/0.log" Dec 05 16:52:47 crc kubenswrapper[4778]: I1205 16:52:47.542322 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-r6qhm_9333a12b-dae0-41c5-a41e-a42c94f5d668/kube-rbac-proxy/0.log" Dec 05 16:52:47 crc kubenswrapper[4778]: I1205 16:52:47.644553 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-r6qhm_9333a12b-dae0-41c5-a41e-a42c94f5d668/manager/0.log" Dec 05 16:52:47 crc kubenswrapper[4778]: I1205 16:52:47.746657 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-lvc5n_aee0787e-b460-4811-aaf5-3ad30d1ca069/manager/0.log" Dec 05 16:52:47 crc kubenswrapper[4778]: I1205 16:52:47.782182 4778 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-lvc5n_aee0787e-b460-4811-aaf5-3ad30d1ca069/kube-rbac-proxy/0.log" Dec 05 16:52:47 crc kubenswrapper[4778]: I1205 16:52:47.945570 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-z2c7d_749eef58-2e11-4af9-80d5-b4ab23f257cc/kube-rbac-proxy/0.log" Dec 05 16:52:48 crc kubenswrapper[4778]: I1205 16:52:48.054358 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-z2c7d_749eef58-2e11-4af9-80d5-b4ab23f257cc/manager/0.log" Dec 05 16:52:48 crc kubenswrapper[4778]: I1205 16:52:48.121215 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-jx9x4_4b11c75e-cbea-4850-b090-a231f3908b53/kube-rbac-proxy/0.log" Dec 05 16:52:48 crc kubenswrapper[4778]: I1205 16:52:48.133952 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x79b8"] Dec 05 16:52:48 crc kubenswrapper[4778]: I1205 16:52:48.164242 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-jx9x4_4b11c75e-cbea-4850-b090-a231f3908b53/manager/0.log" Dec 05 16:52:48 crc kubenswrapper[4778]: I1205 16:52:48.304767 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw_662da656-0ba6-4ff7-85bd-6739ad5c5100/kube-rbac-proxy/0.log" Dec 05 16:52:48 crc kubenswrapper[4778]: I1205 16:52:48.304964 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd48lkrw_662da656-0ba6-4ff7-85bd-6739ad5c5100/manager/0.log" Dec 05 16:52:48 crc kubenswrapper[4778]: I1205 16:52:48.496672 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x79b8" podUID="a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" containerName="registry-server" containerID="cri-o://09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190" gracePeriod=2 Dec 05 16:52:48 crc kubenswrapper[4778]: I1205 16:52:48.622490 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-flgjp_b6ac2f11-7127-47bd-bd32-904e8383126b/registry-server/0.log" Dec 05 16:52:48 crc kubenswrapper[4778]: I1205 16:52:48.668889 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-57f879d6c4-clslw_65f4316e-afb7-4a97-b58e-89653f55ff4a/manager/0.log" Dec 05 16:52:48 crc kubenswrapper[4778]: I1205 16:52:48.725242 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-jxckc_a2f059f8-ee96-4e29-a00f-bee69430c802/kube-rbac-proxy/0.log" Dec 05 16:52:48 crc kubenswrapper[4778]: I1205 16:52:48.849811 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-jxckc_a2f059f8-ee96-4e29-a00f-bee69430c802/manager/0.log" Dec 05 16:52:48 crc kubenswrapper[4778]: I1205 16:52:48.954961 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4828x_0b521841-47d8-461f-a765-c9b7974bb4b7/kube-rbac-proxy/0.log" Dec 05 16:52:49 crc 
kubenswrapper[4778]: I1205 16:52:49.003577 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x79b8" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.063947 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4828x_0b521841-47d8-461f-a765-c9b7974bb4b7/manager/0.log" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.107808 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-b6727_48f6bc28-3426-42fb-9498-1280593297ea/operator/0.log" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.131596 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-utilities\") pod \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\" (UID: \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\") " Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.131731 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-catalog-content\") pod \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\" (UID: \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\") " Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.131761 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6465f\" (UniqueName: \"kubernetes.io/projected/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-kube-api-access-6465f\") pod \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\" (UID: \"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0\") " Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.132520 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-utilities" (OuterVolumeSpecName: "utilities") pod "a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" (UID: "a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.137582 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-kube-api-access-6465f" (OuterVolumeSpecName: "kube-api-access-6465f") pod "a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" (UID: "a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0"). InnerVolumeSpecName "kube-api-access-6465f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.191716 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" (UID: "a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.233050 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.233086 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.233099 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6465f\" (UniqueName: \"kubernetes.io/projected/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0-kube-api-access-6465f\") on node \"crc\" DevicePath \"\"" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.237910 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-rcr6s_996d89ac-bb27-41d3-9ea8-171d71c585e2/kube-rbac-proxy/0.log" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.320895 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-rcr6s_996d89ac-bb27-41d3-9ea8-171d71c585e2/manager/0.log" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.361869 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-hc7pn_862bfade-07c6-405c-bbca-e96341188a5c/kube-rbac-proxy/0.log" Dec 05 16:52:49 crc kubenswrapper[4778]: E1205 16:52:49.457963 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda75fb65a_6b82_4afe_9c4f_264b0a4fc4c0.slice\": RecentStats: unable to find data in memory cache]" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.480315 4778 scope.go:117] "RemoveContainer" containerID="969d2a3ffbb83a9c155fd74710df0776a1476837232b7239b0dc4bf7f9a76ed9" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.507079 4778 generic.go:334] "Generic (PLEG): container finished" podID="a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" containerID="09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190" exitCode=0 Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.507126 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x79b8" event={"ID":"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0","Type":"ContainerDied","Data":"09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190"} Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.507152 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x79b8" event={"ID":"a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0","Type":"ContainerDied","Data":"91fe008d69c1e762229a655bcb1d858aa32a519502081e9970d1cad945523bac"} Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.507167 4778 scope.go:117] "RemoveContainer" containerID="09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190" Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.507296 4778 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.550517 4778 scope.go:117] "RemoveContainer" containerID="a04a034f03fbbcb6f3d5ed51f0bcc1e0ce20aa585931074ac54c76dc3e0414b4"
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.554225 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x79b8"]
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.561388 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x79b8"]
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.574451 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-mfkct_33e87b02-f57e-47bb-934b-159e59f2d7f5/kube-rbac-proxy/0.log"
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.580125 4778 scope.go:117] "RemoveContainer" containerID="3390110f6f508f3f2d9a45c814fe3d10854f2eeccc5a2c4d21a6c3d3c0435701"
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.601525 4778 scope.go:117] "RemoveContainer" containerID="09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190"
Dec 05 16:52:49 crc kubenswrapper[4778]: E1205 16:52:49.602034 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190\": container with ID starting with 09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190 not found: ID does not exist" containerID="09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190"
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.602155 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190"} err="failed to get container status \"09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190\": rpc error: code = NotFound desc = could not find container \"09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190\": container with ID starting with 09a3ef9ebae43e2795b05d1abdda14b4dccb370a66826102045edc166dddc190 not found: ID does not exist"
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.602257 4778 scope.go:117] "RemoveContainer" containerID="a04a034f03fbbcb6f3d5ed51f0bcc1e0ce20aa585931074ac54c76dc3e0414b4"
Dec 05 16:52:49 crc kubenswrapper[4778]: E1205 16:52:49.602773 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a04a034f03fbbcb6f3d5ed51f0bcc1e0ce20aa585931074ac54c76dc3e0414b4\": container with ID starting with a04a034f03fbbcb6f3d5ed51f0bcc1e0ce20aa585931074ac54c76dc3e0414b4 not found: ID does not exist" containerID="a04a034f03fbbcb6f3d5ed51f0bcc1e0ce20aa585931074ac54c76dc3e0414b4"
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.602812 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a04a034f03fbbcb6f3d5ed51f0bcc1e0ce20aa585931074ac54c76dc3e0414b4"} err="failed to get container status \"a04a034f03fbbcb6f3d5ed51f0bcc1e0ce20aa585931074ac54c76dc3e0414b4\": rpc error: code = NotFound desc = could not find container \"a04a034f03fbbcb6f3d5ed51f0bcc1e0ce20aa585931074ac54c76dc3e0414b4\": container with ID starting with a04a034f03fbbcb6f3d5ed51f0bcc1e0ce20aa585931074ac54c76dc3e0414b4 not found: ID does not exist"
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.602856 4778 scope.go:117] "RemoveContainer" containerID="3390110f6f508f3f2d9a45c814fe3d10854f2eeccc5a2c4d21a6c3d3c0435701"
Dec 05 16:52:49 crc kubenswrapper[4778]: E1205 16:52:49.603055 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3390110f6f508f3f2d9a45c814fe3d10854f2eeccc5a2c4d21a6c3d3c0435701\": container with ID starting with 3390110f6f508f3f2d9a45c814fe3d10854f2eeccc5a2c4d21a6c3d3c0435701 not found: ID does not exist" containerID="3390110f6f508f3f2d9a45c814fe3d10854f2eeccc5a2c4d21a6c3d3c0435701"
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.603080 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3390110f6f508f3f2d9a45c814fe3d10854f2eeccc5a2c4d21a6c3d3c0435701"} err="failed to get container status \"3390110f6f508f3f2d9a45c814fe3d10854f2eeccc5a2c4d21a6c3d3c0435701\": rpc error: code = NotFound desc = could not find container \"3390110f6f508f3f2d9a45c814fe3d10854f2eeccc5a2c4d21a6c3d3c0435701\": container with ID starting with 3390110f6f508f3f2d9a45c814fe3d10854f2eeccc5a2c4d21a6c3d3c0435701 not found: ID does not exist"
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.612908 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-hc7pn_862bfade-07c6-405c-bbca-e96341188a5c/manager/0.log"
Dec 05 16:52:49 crc kubenswrapper[4778]: I1205 16:52:49.714098 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-mfkct_33e87b02-f57e-47bb-934b-159e59f2d7f5/manager/0.log"
Dec 05 16:52:50 crc kubenswrapper[4778]: I1205 16:52:50.026546 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5dd64f5b5b-rmlwm_94682128-df97-406d-8947-1e8dd8199a99/manager/0.log"
Dec 05 16:52:50 crc kubenswrapper[4778]: I1205 16:52:50.046138 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-index-fwgv5_4f97e6d4-7346-4696-901f-eb6822513707/registry-server/0.log"
Dec 05 16:52:51 crc kubenswrapper[4778]: I1205 16:52:51.258113 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" path="/var/lib/kubelet/pods/a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0/volumes"
Dec 05 16:52:51 crc kubenswrapper[4778]: I1205 16:52:51.744431 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:52:51 crc kubenswrapper[4778]: I1205 16:52:51.744468 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 16:52:51 crc kubenswrapper[4778]: I1205 16:52:51.745181 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778"
Dec 05 16:52:51 crc kubenswrapper[4778]: E1205 16:52:51.745462 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b"
Dec 05 16:53:05 crc kubenswrapper[4778]: I1205 16:53:05.250489 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778"
Dec 05 16:53:05 crc kubenswrapper[4778]: E1205 16:53:05.251236 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b"
Dec 05 16:53:10 crc kubenswrapper[4778]: I1205 16:53:10.284243 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8gjtp_cac1e607-7735-4696-9666-34cb5ecb4857/control-plane-machine-set-operator/0.log"
Dec 05 16:53:10 crc kubenswrapper[4778]: I1205 16:53:10.549239 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-27smd_f859bdf8-f651-407a-a6b8-6c3ae2fe7f63/machine-api-operator/0.log"
Dec 05 16:53:10 crc kubenswrapper[4778]: I1205 16:53:10.553874 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-27smd_f859bdf8-f651-407a-a6b8-6c3ae2fe7f63/kube-rbac-proxy/0.log"
Dec 05 16:53:16 crc kubenswrapper[4778]: I1205 16:53:16.250166 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778"
Dec 05 16:53:16 crc kubenswrapper[4778]: E1205 16:53:16.251030 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b"
Dec 05 16:53:23 crc kubenswrapper[4778]: I1205 16:53:23.884768 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-j8xvc_db44b5fa-a7f8-4aac-bf9a-5669e0fad581/cert-manager-controller/0.log"
Dec 05 16:53:24 crc kubenswrapper[4778]: I1205 16:53:24.079382 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-f6qgw_21fa2092-e29f-4bf1-b446-1561e7c2c35b/cert-manager-cainjector/0.log"
Dec 05 16:53:24 crc kubenswrapper[4778]: I1205 16:53:24.177958 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-stl96_81ecb6f0-ea99-4c8f-802f-eec53d6c0eb0/cert-manager-webhook/0.log"
Dec 05 16:53:28 crc kubenswrapper[4778]: I1205 16:53:28.249120 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778"
Dec 05 16:53:28 crc kubenswrapper[4778]: E1205 16:53:28.250039 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b"
Dec 05 16:53:37 crc kubenswrapper[4778]: I1205 16:53:37.805711 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-w494m_4cbfbce7-cb93-4e17-8e3a-688a04322274/nmstate-console-plugin/0.log"
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-w494m_4cbfbce7-cb93-4e17-8e3a-688a04322274/nmstate-console-plugin/0.log" Dec 05 16:53:38 crc kubenswrapper[4778]: I1205 16:53:38.097699 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mms85_63520720-3a3d-44da-b13c-7406c45a6d50/nmstate-handler/0.log" Dec 05 16:53:38 crc kubenswrapper[4778]: I1205 16:53:38.171798 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-4zkxj_63e85179-39a5-418e-abe3-91a9a9a276e3/kube-rbac-proxy/0.log" Dec 05 16:53:38 crc kubenswrapper[4778]: I1205 16:53:38.240713 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-4zkxj_63e85179-39a5-418e-abe3-91a9a9a276e3/nmstate-metrics/0.log" Dec 05 16:53:38 crc kubenswrapper[4778]: I1205 16:53:38.404885 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-qjjrk_530e55ef-5024-4ab8-9072-709ca49ddc13/nmstate-operator/0.log" Dec 05 16:53:38 crc kubenswrapper[4778]: I1205 16:53:38.462391 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-64vbd_585193cc-6d97-46b0-95be-7ae7fffb2d11/nmstate-webhook/0.log" Dec 05 16:53:43 crc kubenswrapper[4778]: I1205 16:53:43.255685 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:53:43 crc kubenswrapper[4778]: E1205 16:53:43.256484 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:53:54 crc kubenswrapper[4778]: I1205 16:53:54.259222 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-ld99k_687c889e-852e-49f7-a18f-4992370c1829/kube-rbac-proxy/0.log" Dec 05 16:53:54 crc kubenswrapper[4778]: I1205 16:53:54.348350 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-ld99k_687c889e-852e-49f7-a18f-4992370c1829/controller/0.log" Dec 05 16:53:54 crc kubenswrapper[4778]: I1205 16:53:54.456023 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/cp-frr-files/0.log" Dec 05 16:53:54 crc kubenswrapper[4778]: I1205 16:53:54.777645 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/cp-frr-files/0.log" Dec 05 16:53:54 crc kubenswrapper[4778]: I1205 16:53:54.805198 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/cp-reloader/0.log" Dec 05 16:53:54 crc kubenswrapper[4778]: I1205 16:53:54.830085 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/cp-metrics/0.log" Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.027983 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/cp-reloader/0.log" Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 
Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.231103 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/cp-metrics/0.log"
Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.252710 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/cp-metrics/0.log"
Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.300182 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/cp-frr-files/0.log"
Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.479147 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/cp-metrics/0.log"
Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.479901 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/cp-reloader/0.log"
Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.508274 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/cp-frr-files/0.log"
Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.524891 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/controller/0.log"
Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.709577 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/kube-rbac-proxy/0.log"
Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.769254 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/kube-rbac-proxy-frr/0.log"
Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.778464 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/frr-metrics/0.log"
Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.965383 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-nz7bg_518d4b75-d756-48af-80a4-26e23ff4507b/frr-k8s-webhook-server/0.log"
Dec 05 16:53:55 crc kubenswrapper[4778]: I1205 16:53:55.983918 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/reloader/0.log"
Dec 05 16:53:56 crc kubenswrapper[4778]: I1205 16:53:56.261898 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-78b6b5f7d4-wmnmr_e5f9a9c1-e0f5-4b56-b66e-c70686a83d57/manager/0.log"
Dec 05 16:53:56 crc kubenswrapper[4778]: I1205 16:53:56.415580 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5lwxz_c86fd8e4-7fe7-4e64-adb0-eb7f82b243c6/frr/0.log"
Dec 05 16:53:56 crc kubenswrapper[4778]: I1205 16:53:56.445404 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-695c4fbf6f-4nkqk_54d90db9-7dd5-4012-9dd8-1fd612a4db38/webhook-server/0.log"
Dec 05 16:53:56 crc kubenswrapper[4778]: I1205 16:53:56.481605 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qnv6q_82af16e8-7640-4de9-bd7c-baaf968f7a98/kube-rbac-proxy/0.log"
file" path="/var/log/pods/metallb-system_speaker-qnv6q_82af16e8-7640-4de9-bd7c-baaf968f7a98/kube-rbac-proxy/0.log" Dec 05 16:53:56 crc kubenswrapper[4778]: I1205 16:53:56.769354 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qnv6q_82af16e8-7640-4de9-bd7c-baaf968f7a98/speaker/0.log" Dec 05 16:53:57 crc kubenswrapper[4778]: I1205 16:53:57.249603 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:53:57 crc kubenswrapper[4778]: E1205 16:53:57.249934 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:54:03 crc kubenswrapper[4778]: I1205 16:54:03.414138 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:54:03 crc kubenswrapper[4778]: I1205 16:54:03.414762 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:54:12 crc kubenswrapper[4778]: I1205 16:54:12.250177 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:54:12 crc kubenswrapper[4778]: E1205 16:54:12.251061 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:54:22 crc kubenswrapper[4778]: I1205 16:54:22.629243 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_02c25bd5-13b2-447c-ba37-5aadb8a33da0/init-config-reloader/0.log" Dec 05 16:54:22 crc kubenswrapper[4778]: I1205 16:54:22.838824 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_02c25bd5-13b2-447c-ba37-5aadb8a33da0/alertmanager/0.log" Dec 05 16:54:22 crc kubenswrapper[4778]: I1205 16:54:22.857793 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_02c25bd5-13b2-447c-ba37-5aadb8a33da0/init-config-reloader/0.log" Dec 05 16:54:22 crc kubenswrapper[4778]: I1205 16:54:22.894106 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_02c25bd5-13b2-447c-ba37-5aadb8a33da0/config-reloader/0.log" Dec 05 16:54:23 crc kubenswrapper[4778]: I1205 16:54:23.033841 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_ceilometer-0_e8dbe352-c702-4ea0-abd4-a1caf21004cc/ceilometer-central-agent/0.log" Dec 05 16:54:23 crc kubenswrapper[4778]: I1205 16:54:23.136946 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_e8dbe352-c702-4ea0-abd4-a1caf21004cc/ceilometer-notification-agent/0.log" Dec 05 16:54:23 crc kubenswrapper[4778]: I1205 16:54:23.188133 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_e8dbe352-c702-4ea0-abd4-a1caf21004cc/proxy-httpd/0.log" Dec 05 16:54:23 crc kubenswrapper[4778]: I1205 16:54:23.281783 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_e8dbe352-c702-4ea0-abd4-a1caf21004cc/sg-core/0.log" Dec 05 16:54:23 crc kubenswrapper[4778]: I1205 16:54:23.545349 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_keystone-7dc59c9b94-xvw4w_fb21dc96-1d4b-4116-95d2-659fc1daa3cd/keystone-api/0.log" Dec 05 16:54:23 crc kubenswrapper[4778]: I1205 16:54:23.651004 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_kube-state-metrics-0_04eaf4e3-1e04-4041-9721-d2b1bfcb44c9/kube-state-metrics/0.log" Dec 05 16:54:23 crc kubenswrapper[4778]: I1205 16:54:23.925025 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_e6c7d544-fb70-43d0-a77f-db48a7a63582/mysql-bootstrap/0.log" Dec 05 16:54:24 crc kubenswrapper[4778]: I1205 16:54:24.357149 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_e6c7d544-fb70-43d0-a77f-db48a7a63582/mysql-bootstrap/0.log" Dec 05 16:54:24 crc kubenswrapper[4778]: I1205 16:54:24.406204 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_e6c7d544-fb70-43d0-a77f-db48a7a63582/galera/0.log" Dec 05 16:54:24 crc kubenswrapper[4778]: I1205 16:54:24.581695 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstackclient_ce8951ca-0e99-43c8-b7d5-7c85ae7ed0c8/openstackclient/0.log" Dec 05 16:54:24 crc kubenswrapper[4778]: I1205 16:54:24.775918 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_c124cdbe-b3e4-465b-8657-9b749da2e709/init-config-reloader/0.log" Dec 05 16:54:25 crc kubenswrapper[4778]: I1205 16:54:25.024393 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_c124cdbe-b3e4-465b-8657-9b749da2e709/prometheus/0.log" Dec 05 16:54:25 crc kubenswrapper[4778]: I1205 16:54:25.031640 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_c124cdbe-b3e4-465b-8657-9b749da2e709/init-config-reloader/0.log" Dec 05 16:54:25 crc kubenswrapper[4778]: I1205 16:54:25.088535 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_c124cdbe-b3e4-465b-8657-9b749da2e709/config-reloader/0.log" Dec 05 16:54:25 crc kubenswrapper[4778]: I1205 16:54:25.314415 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_f73e1f56-f326-4886-9a0d-8f72407ebeb6/setup-container/0.log" Dec 05 16:54:25 crc kubenswrapper[4778]: I1205 16:54:25.346182 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_c124cdbe-b3e4-465b-8657-9b749da2e709/thanos-sidecar/0.log" Dec 05 16:54:25 crc kubenswrapper[4778]: I1205 16:54:25.601978 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_f73e1f56-f326-4886-9a0d-8f72407ebeb6/rabbitmq/0.log" Dec 05 16:54:25 crc kubenswrapper[4778]: I1205 16:54:25.833681 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_f73e1f56-f326-4886-9a0d-8f72407ebeb6/setup-container/0.log" Dec 05 16:54:26 crc kubenswrapper[4778]: I1205 16:54:26.075087 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_daf89267-199f-4532-b4f7-a74fc2ef5425/setup-container/0.log" Dec 05 16:54:26 crc kubenswrapper[4778]: I1205 16:54:26.292786 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_daf89267-199f-4532-b4f7-a74fc2ef5425/setup-container/0.log" Dec 05 16:54:26 crc kubenswrapper[4778]: I1205 16:54:26.481542 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_daf89267-199f-4532-b4f7-a74fc2ef5425/rabbitmq/0.log" Dec 05 16:54:26 crc kubenswrapper[4778]: I1205 16:54:26.553608 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-3a5a-account-create-update-fljmj_2d13bc57-d3d4-4278-b9a9-c90937db910e/mariadb-account-create-update/0.log" Dec 05 16:54:26 crc kubenswrapper[4778]: I1205 16:54:26.718526 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-db-create-pghpr_60883c3e-aae4-43bc-86d2-83b6801b6201/mariadb-database-create/0.log" Dec 05 16:54:26 crc kubenswrapper[4778]: I1205 16:54:26.892433 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-api-0_7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4/watcher-api/0.log" Dec 05 16:54:26 crc kubenswrapper[4778]: I1205 16:54:26.995597 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-api-0_7a7ad552-9b47-4fc9-a3ca-4069a2a2b0b4/watcher-kuttl-api-log/0.log" Dec 05 16:54:27 crc kubenswrapper[4778]: I1205 16:54:27.078033 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-applier-0_819d647d-6170-4b57-a849-7d686ddf2d65/watcher-applier/0.log" Dec 05 16:54:27 crc kubenswrapper[4778]: I1205 16:54:27.225868 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-db-sync-jfd52_016cbd75-1b09-4c22-ad66-3f97406f16f7/watcher-kuttl-db-sync/0.log" Dec 05 16:54:27 crc kubenswrapper[4778]: I1205 16:54:27.249081 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:54:27 crc kubenswrapper[4778]: E1205 16:54:27.249307 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:54:27 crc kubenswrapper[4778]: I1205 16:54:27.316790 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b/watcher-decision-engine/6.log" Dec 05 16:54:27 crc kubenswrapper[4778]: I1205 16:54:27.477484 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b/watcher-decision-engine/6.log" Dec 05 16:54:33 crc kubenswrapper[4778]: I1205 16:54:33.414397 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:54:33 crc kubenswrapper[4778]: I1205 16:54:33.415267 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:54:37 crc kubenswrapper[4778]: I1205 16:54:37.863545 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_memcached-0_859e60ea-04a4-49b6-8d50-6268c41f8131/memcached/0.log" Dec 05 16:54:40 crc kubenswrapper[4778]: I1205 16:54:40.249910 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:54:40 crc kubenswrapper[4778]: E1205 16:54:40.250520 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:54:46 crc kubenswrapper[4778]: I1205 16:54:46.500129 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt_2210330e-e3d5-4777-ad4f-61aa0a94f73a/util/0.log" Dec 05 16:54:46 crc kubenswrapper[4778]: I1205 16:54:46.842690 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt_2210330e-e3d5-4777-ad4f-61aa0a94f73a/util/0.log" Dec 05 16:54:46 crc kubenswrapper[4778]: I1205 16:54:46.923040 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt_2210330e-e3d5-4777-ad4f-61aa0a94f73a/pull/0.log" Dec 05 16:54:46 crc kubenswrapper[4778]: I1205 16:54:46.934582 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt_2210330e-e3d5-4777-ad4f-61aa0a94f73a/pull/0.log" Dec 05 16:54:47 crc kubenswrapper[4778]: I1205 16:54:47.104359 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt_2210330e-e3d5-4777-ad4f-61aa0a94f73a/pull/0.log" Dec 05 16:54:47 crc kubenswrapper[4778]: I1205 16:54:47.167314 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt_2210330e-e3d5-4777-ad4f-61aa0a94f73a/util/0.log" Dec 05 16:54:47 crc kubenswrapper[4778]: I1205 16:54:47.248102 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acpjjt_2210330e-e3d5-4777-ad4f-61aa0a94f73a/extract/0.log" Dec 05 16:54:47 crc kubenswrapper[4778]: I1205 16:54:47.325425 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx_27a8b0f6-f04b-4ce5-a429-019e09e1c6b8/util/0.log" Dec 05 16:54:47 crc kubenswrapper[4778]: I1205 16:54:47.512074 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx_27a8b0f6-f04b-4ce5-a429-019e09e1c6b8/util/0.log" Dec 05 16:54:47 crc kubenswrapper[4778]: I1205 16:54:47.514608 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx_27a8b0f6-f04b-4ce5-a429-019e09e1c6b8/pull/0.log" Dec 05 16:54:47 crc kubenswrapper[4778]: I1205 16:54:47.558377 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx_27a8b0f6-f04b-4ce5-a429-019e09e1c6b8/pull/0.log" Dec 05 16:54:47 crc kubenswrapper[4778]: I1205 16:54:47.718075 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx_27a8b0f6-f04b-4ce5-a429-019e09e1c6b8/util/0.log" Dec 05 16:54:47 crc kubenswrapper[4778]: I1205 16:54:47.759565 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx_27a8b0f6-f04b-4ce5-a429-019e09e1c6b8/extract/0.log" Dec 05 16:54:47 crc kubenswrapper[4778]: I1205 16:54:47.786944 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fjw7cx_27a8b0f6-f04b-4ce5-a429-019e09e1c6b8/pull/0.log" Dec 05 16:54:47 crc kubenswrapper[4778]: I1205 16:54:47.940988 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn_4b63bba1-6d00-4e21-81b5-9c2573000afd/util/0.log" Dec 05 16:54:48 crc kubenswrapper[4778]: I1205 16:54:48.164554 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn_4b63bba1-6d00-4e21-81b5-9c2573000afd/util/0.log" Dec 05 16:54:48 crc kubenswrapper[4778]: I1205 16:54:48.172047 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn_4b63bba1-6d00-4e21-81b5-9c2573000afd/pull/0.log" Dec 05 16:54:48 crc kubenswrapper[4778]: I1205 16:54:48.206351 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn_4b63bba1-6d00-4e21-81b5-9c2573000afd/pull/0.log" Dec 05 16:54:48 crc kubenswrapper[4778]: I1205 16:54:48.385281 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn_4b63bba1-6d00-4e21-81b5-9c2573000afd/pull/0.log" Dec 05 16:54:48 crc kubenswrapper[4778]: I1205 16:54:48.387332 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn_4b63bba1-6d00-4e21-81b5-9c2573000afd/util/0.log" Dec 05 16:54:48 crc kubenswrapper[4778]: I1205 16:54:48.413330 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jk8sn_4b63bba1-6d00-4e21-81b5-9c2573000afd/extract/0.log" Dec 05 16:54:48 crc kubenswrapper[4778]: I1205 16:54:48.620962 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw_b44f9629-0529-4a0b-bb95-5690c110cc51/util/0.log" Dec 05 16:54:48 crc kubenswrapper[4778]: I1205 16:54:48.801652 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw_b44f9629-0529-4a0b-bb95-5690c110cc51/pull/0.log" Dec 05 16:54:48 crc kubenswrapper[4778]: I1205 16:54:48.815891 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw_b44f9629-0529-4a0b-bb95-5690c110cc51/util/0.log" Dec 05 16:54:48 crc kubenswrapper[4778]: I1205 16:54:48.860565 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw_b44f9629-0529-4a0b-bb95-5690c110cc51/pull/0.log" Dec 05 16:54:49 crc kubenswrapper[4778]: I1205 16:54:49.081464 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw_b44f9629-0529-4a0b-bb95-5690c110cc51/util/0.log" Dec 05 16:54:49 crc kubenswrapper[4778]: I1205 16:54:49.089854 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw_b44f9629-0529-4a0b-bb95-5690c110cc51/extract/0.log" Dec 05 16:54:49 crc kubenswrapper[4778]: I1205 16:54:49.169261 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qpvgw_b44f9629-0529-4a0b-bb95-5690c110cc51/pull/0.log" Dec 05 16:54:49 crc kubenswrapper[4778]: I1205 16:54:49.283268 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v5hfn_193ca5ff-dd38-4ec6-aff7-3a43a42a12d9/extract-utilities/0.log" Dec 05 16:54:49 crc kubenswrapper[4778]: I1205 16:54:49.524849 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v5hfn_193ca5ff-dd38-4ec6-aff7-3a43a42a12d9/extract-content/0.log" Dec 05 16:54:49 crc kubenswrapper[4778]: I1205 16:54:49.545192 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v5hfn_193ca5ff-dd38-4ec6-aff7-3a43a42a12d9/extract-utilities/0.log" Dec 05 16:54:49 crc kubenswrapper[4778]: I1205 16:54:49.557158 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v5hfn_193ca5ff-dd38-4ec6-aff7-3a43a42a12d9/extract-content/0.log" Dec 05 16:54:49 crc kubenswrapper[4778]: I1205 16:54:49.890086 4778 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v5hfn_193ca5ff-dd38-4ec6-aff7-3a43a42a12d9/extract-utilities/0.log" Dec 05 16:54:49 crc kubenswrapper[4778]: I1205 16:54:49.939460 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v5hfn_193ca5ff-dd38-4ec6-aff7-3a43a42a12d9/extract-content/0.log" Dec 05 16:54:50 crc kubenswrapper[4778]: I1205 16:54:50.128491 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pqhjb_8101c50d-ea04-4fc7-a438-951874cc0351/extract-utilities/0.log" Dec 05 16:54:50 crc kubenswrapper[4778]: I1205 16:54:50.334077 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pqhjb_8101c50d-ea04-4fc7-a438-951874cc0351/extract-content/0.log" Dec 05 16:54:50 crc kubenswrapper[4778]: I1205 16:54:50.340847 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pqhjb_8101c50d-ea04-4fc7-a438-951874cc0351/extract-utilities/0.log" Dec 05 16:54:50 crc kubenswrapper[4778]: I1205 16:54:50.340980 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pqhjb_8101c50d-ea04-4fc7-a438-951874cc0351/extract-content/0.log" Dec 05 16:54:50 crc kubenswrapper[4778]: I1205 16:54:50.545185 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pqhjb_8101c50d-ea04-4fc7-a438-951874cc0351/extract-content/0.log" Dec 05 16:54:50 crc kubenswrapper[4778]: I1205 16:54:50.643189 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v5hfn_193ca5ff-dd38-4ec6-aff7-3a43a42a12d9/registry-server/0.log" Dec 05 16:54:50 crc kubenswrapper[4778]: I1205 16:54:50.652404 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pqhjb_8101c50d-ea04-4fc7-a438-951874cc0351/extract-utilities/0.log" Dec 05 16:54:50 crc kubenswrapper[4778]: I1205 16:54:50.811684 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lc9sh_bddf2447-16af-4f91-ab3b-1d910c27027a/marketplace-operator/0.log" Dec 05 16:54:50 crc kubenswrapper[4778]: I1205 16:54:50.934601 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8kfvb_6e1fc109-ed5c-4277-8800-a02b2c92cd1c/extract-utilities/0.log" Dec 05 16:54:51 crc kubenswrapper[4778]: I1205 16:54:51.045201 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8kfvb_6e1fc109-ed5c-4277-8800-a02b2c92cd1c/extract-utilities/0.log" Dec 05 16:54:51 crc kubenswrapper[4778]: I1205 16:54:51.115147 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8kfvb_6e1fc109-ed5c-4277-8800-a02b2c92cd1c/extract-content/0.log" Dec 05 16:54:51 crc kubenswrapper[4778]: I1205 16:54:51.187167 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8kfvb_6e1fc109-ed5c-4277-8800-a02b2c92cd1c/extract-content/0.log" Dec 05 16:54:51 crc kubenswrapper[4778]: I1205 16:54:51.332200 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pqhjb_8101c50d-ea04-4fc7-a438-951874cc0351/registry-server/0.log" Dec 05 16:54:51 crc kubenswrapper[4778]: I1205 16:54:51.419092 4778 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8kfvb_6e1fc109-ed5c-4277-8800-a02b2c92cd1c/extract-content/0.log" Dec 05 16:54:51 crc kubenswrapper[4778]: I1205 16:54:51.534944 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8kfvb_6e1fc109-ed5c-4277-8800-a02b2c92cd1c/registry-server/0.log" Dec 05 16:54:51 crc kubenswrapper[4778]: I1205 16:54:51.562483 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8kfvb_6e1fc109-ed5c-4277-8800-a02b2c92cd1c/extract-utilities/0.log" Dec 05 16:54:51 crc kubenswrapper[4778]: I1205 16:54:51.580581 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-whqgf_fe3bc4f1-bfa2-4b56-a478-8718c3c15bef/extract-utilities/0.log" Dec 05 16:54:51 crc kubenswrapper[4778]: I1205 16:54:51.816142 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-whqgf_fe3bc4f1-bfa2-4b56-a478-8718c3c15bef/extract-utilities/0.log" Dec 05 16:54:51 crc kubenswrapper[4778]: I1205 16:54:51.824472 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-whqgf_fe3bc4f1-bfa2-4b56-a478-8718c3c15bef/extract-content/0.log" Dec 05 16:54:51 crc kubenswrapper[4778]: I1205 16:54:51.862971 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-whqgf_fe3bc4f1-bfa2-4b56-a478-8718c3c15bef/extract-content/0.log" Dec 05 16:54:52 crc kubenswrapper[4778]: I1205 16:54:52.056818 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-whqgf_fe3bc4f1-bfa2-4b56-a478-8718c3c15bef/extract-utilities/0.log" Dec 05 16:54:52 crc kubenswrapper[4778]: I1205 16:54:52.167080 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-whqgf_fe3bc4f1-bfa2-4b56-a478-8718c3c15bef/extract-content/0.log" Dec 05 16:54:52 crc kubenswrapper[4778]: I1205 16:54:52.248963 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:54:52 crc kubenswrapper[4778]: E1205 16:54:52.249287 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:54:52 crc kubenswrapper[4778]: I1205 16:54:52.568052 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-whqgf_fe3bc4f1-bfa2-4b56-a478-8718c3c15bef/registry-server/0.log" Dec 05 16:55:03 crc kubenswrapper[4778]: I1205 16:55:03.414494 4778 patch_prober.go:28] interesting pod/machine-config-daemon-jqrsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:55:03 crc kubenswrapper[4778]: I1205 16:55:03.415101 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:55:03 crc kubenswrapper[4778]: I1205 16:55:03.415153 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" Dec 05 16:55:03 crc kubenswrapper[4778]: I1205 16:55:03.415764 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450"} pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:55:03 crc kubenswrapper[4778]: I1205 16:55:03.415835 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerName="machine-config-daemon" containerID="cri-o://39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" gracePeriod=600 Dec 05 16:55:03 crc kubenswrapper[4778]: E1205 16:55:03.546487 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:55:03 crc kubenswrapper[4778]: I1205 16:55:03.621855 4778 generic.go:334] "Generic (PLEG): container finished" podID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" exitCode=0 Dec 05 16:55:03 crc kubenswrapper[4778]: I1205 16:55:03.621903 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerDied","Data":"39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450"} Dec 05 16:55:03 crc kubenswrapper[4778]: I1205 16:55:03.621941 4778 scope.go:117] "RemoveContainer" containerID="5e27f38bc69e604af8eadcc3f113811915e11ea2ac4830d8c03bf5bb2b0e0f8f" Dec 05 16:55:03 crc kubenswrapper[4778]: I1205 16:55:03.622748 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:55:03 crc kubenswrapper[4778]: E1205 16:55:03.623200 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:55:04 crc kubenswrapper[4778]: I1205 16:55:04.249877 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:55:04 crc kubenswrapper[4778]: E1205 16:55:04.250336 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine 
Dec 05 16:55:05 crc kubenswrapper[4778]: I1205 16:55:05.946618 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-phf8p_7d00eb5e-6107-4c91-b9ed-540833e16404/prometheus-operator/0.log"
Dec 05 16:55:06 crc kubenswrapper[4778]: I1205 16:55:06.087259 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6c9fd54565-kxj7v_22a6d7c0-787b-4db2-b559-a15f91626619/prometheus-operator-admission-webhook/0.log"
Dec 05 16:55:06 crc kubenswrapper[4778]: I1205 16:55:06.162895 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6c9fd54565-txlnf_06f22ef3-dbf4-44cd-bd3c-099d4b23c440/prometheus-operator-admission-webhook/0.log"
Dec 05 16:55:06 crc kubenswrapper[4778]: I1205 16:55:06.251857 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-5xcd4_8e816989-62eb-47ce-a33b-2f09f1d2b3c6/operator/0.log"
Dec 05 16:55:06 crc kubenswrapper[4778]: I1205 16:55:06.324288 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-lk9wb_66d01458-bc30-463f-b7e3-6e20bb4ea267/observability-ui-dashboards/0.log"
Dec 05 16:55:06 crc kubenswrapper[4778]: I1205 16:55:06.442427 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-nvd8k_da29284c-59ce-4aed-a280-0b9b550c2c96/perses-operator/0.log"
Dec 05 16:55:15 crc kubenswrapper[4778]: I1205 16:55:15.251168 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450"
Dec 05 16:55:15 crc kubenswrapper[4778]: E1205 16:55:15.254056 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e"
Dec 05 16:55:17 crc kubenswrapper[4778]: I1205 16:55:17.249168 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778"
Dec 05 16:55:17 crc kubenswrapper[4778]: E1205 16:55:17.249655 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b"
Dec 05 16:55:29 crc kubenswrapper[4778]: I1205 16:55:29.250331 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450"
Dec 05 16:55:29 crc kubenswrapper[4778]: E1205 16:55:29.251249 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e"
Dec 05 16:55:32 crc kubenswrapper[4778]: I1205 16:55:32.249785 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778"
Dec 05 16:55:32 crc kubenswrapper[4778]: E1205 16:55:32.250463 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b"
Dec 05 16:55:40 crc kubenswrapper[4778]: I1205 16:55:40.249849 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450"
Dec 05 16:55:40 crc kubenswrapper[4778]: E1205 16:55:40.250572 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e"
Dec 05 16:55:43 crc kubenswrapper[4778]: I1205 16:55:43.256159 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778"
Dec 05 16:55:43 crc kubenswrapper[4778]: E1205 16:55:43.256936 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b"
Dec 05 16:55:55 crc kubenswrapper[4778]: I1205 16:55:55.256463 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450"
Dec 05 16:55:55 crc kubenswrapper[4778]: E1205 16:55:55.257405 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e"
Dec 05 16:55:57 crc kubenswrapper[4778]: I1205 16:55:57.252116 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778"
Dec 05 16:55:57 crc kubenswrapper[4778]: E1205 16:55:57.252759 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b"
podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:56:07 crc kubenswrapper[4778]: I1205 16:56:07.173166 4778 generic.go:334] "Generic (PLEG): container finished" podID="2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3" containerID="be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2" exitCode=0 Dec 05 16:56:07 crc kubenswrapper[4778]: I1205 16:56:07.173454 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wx5k/must-gather-w578h" event={"ID":"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3","Type":"ContainerDied","Data":"be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2"} Dec 05 16:56:07 crc kubenswrapper[4778]: I1205 16:56:07.174146 4778 scope.go:117] "RemoveContainer" containerID="be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2" Dec 05 16:56:08 crc kubenswrapper[4778]: I1205 16:56:08.141608 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5wx5k_must-gather-w578h_2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3/gather/0.log" Dec 05 16:56:08 crc kubenswrapper[4778]: I1205 16:56:08.249520 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:56:08 crc kubenswrapper[4778]: E1205 16:56:08.249819 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:56:10 crc kubenswrapper[4778]: I1205 16:56:10.067639 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-pghpr"] Dec 05 16:56:10 crc kubenswrapper[4778]: I1205 16:56:10.079707 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj"] Dec 05 16:56:10 crc kubenswrapper[4778]: I1205 16:56:10.093309 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-pghpr"] Dec 05 16:56:10 crc kubenswrapper[4778]: I1205 16:56:10.101530 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-3a5a-account-create-update-fljmj"] Dec 05 16:56:11 crc kubenswrapper[4778]: I1205 16:56:11.250041 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:56:11 crc kubenswrapper[4778]: E1205 16:56:11.250466 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:56:11 crc kubenswrapper[4778]: I1205 16:56:11.262228 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d13bc57-d3d4-4278-b9a9-c90937db910e" path="/var/lib/kubelet/pods/2d13bc57-d3d4-4278-b9a9-c90937db910e/volumes" Dec 05 16:56:11 crc kubenswrapper[4778]: I1205 16:56:11.263007 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60883c3e-aae4-43bc-86d2-83b6801b6201" 
path="/var/lib/kubelet/pods/60883c3e-aae4-43bc-86d2-83b6801b6201/volumes" Dec 05 16:56:15 crc kubenswrapper[4778]: I1205 16:56:15.532164 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5wx5k/must-gather-w578h"] Dec 05 16:56:15 crc kubenswrapper[4778]: I1205 16:56:15.533179 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5wx5k/must-gather-w578h" podUID="2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3" containerName="copy" containerID="cri-o://4393d61bb90671a6004faddece9e64fbe19b40e8fecb97f67e6846a13c8a6fc7" gracePeriod=2 Dec 05 16:56:15 crc kubenswrapper[4778]: I1205 16:56:15.542228 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5wx5k/must-gather-w578h"] Dec 05 16:56:15 crc kubenswrapper[4778]: I1205 16:56:15.902528 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5wx5k_must-gather-w578h_2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3/copy/0.log" Dec 05 16:56:15 crc kubenswrapper[4778]: I1205 16:56:15.903221 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5wx5k/must-gather-w578h" Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.096033 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3-must-gather-output\") pod \"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3\" (UID: \"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3\") " Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.096439 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cdfg\" (UniqueName: \"kubernetes.io/projected/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3-kube-api-access-7cdfg\") pod \"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3\" (UID: \"2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3\") " Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.100958 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3-kube-api-access-7cdfg" (OuterVolumeSpecName: "kube-api-access-7cdfg") pod "2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3" (UID: "2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3"). InnerVolumeSpecName "kube-api-access-7cdfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.198094 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cdfg\" (UniqueName: \"kubernetes.io/projected/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3-kube-api-access-7cdfg\") on node \"crc\" DevicePath \"\"" Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.200213 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3" (UID: "2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.258728 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5wx5k_must-gather-w578h_2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3/copy/0.log" Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.259106 4778 generic.go:334] "Generic (PLEG): container finished" podID="2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3" containerID="4393d61bb90671a6004faddece9e64fbe19b40e8fecb97f67e6846a13c8a6fc7" exitCode=143 Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.259154 4778 scope.go:117] "RemoveContainer" containerID="4393d61bb90671a6004faddece9e64fbe19b40e8fecb97f67e6846a13c8a6fc7" Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.259314 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5wx5k/must-gather-w578h" Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.286486 4778 scope.go:117] "RemoveContainer" containerID="be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2" Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.299976 4778 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.352763 4778 scope.go:117] "RemoveContainer" containerID="4393d61bb90671a6004faddece9e64fbe19b40e8fecb97f67e6846a13c8a6fc7" Dec 05 16:56:16 crc kubenswrapper[4778]: E1205 16:56:16.353616 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4393d61bb90671a6004faddece9e64fbe19b40e8fecb97f67e6846a13c8a6fc7\": container with ID starting with 4393d61bb90671a6004faddece9e64fbe19b40e8fecb97f67e6846a13c8a6fc7 not found: ID does not exist" containerID="4393d61bb90671a6004faddece9e64fbe19b40e8fecb97f67e6846a13c8a6fc7" Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.353723 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4393d61bb90671a6004faddece9e64fbe19b40e8fecb97f67e6846a13c8a6fc7"} err="failed to get container status \"4393d61bb90671a6004faddece9e64fbe19b40e8fecb97f67e6846a13c8a6fc7\": rpc error: code = NotFound desc = could not find container \"4393d61bb90671a6004faddece9e64fbe19b40e8fecb97f67e6846a13c8a6fc7\": container with ID starting with 4393d61bb90671a6004faddece9e64fbe19b40e8fecb97f67e6846a13c8a6fc7 not found: ID does not exist" Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.353805 4778 scope.go:117] "RemoveContainer" containerID="be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2" Dec 05 16:56:16 crc kubenswrapper[4778]: E1205 16:56:16.354426 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2\": container with ID starting with be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2 not found: ID does not exist" containerID="be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2" Dec 05 16:56:16 crc kubenswrapper[4778]: I1205 16:56:16.354474 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2"} err="failed to get container status 
\"be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2\": rpc error: code = NotFound desc = could not find container \"be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2\": container with ID starting with be7aaf96d084d8def3ef9c607efb88086f3390f589224860a05a2c0ebab288c2 not found: ID does not exist" Dec 05 16:56:17 crc kubenswrapper[4778]: I1205 16:56:17.259519 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3" path="/var/lib/kubelet/pods/2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3/volumes" Dec 05 16:56:21 crc kubenswrapper[4778]: I1205 16:56:21.038131 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-jfd52"] Dec 05 16:56:21 crc kubenswrapper[4778]: I1205 16:56:21.053362 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-jfd52"] Dec 05 16:56:21 crc kubenswrapper[4778]: I1205 16:56:21.262003 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="016cbd75-1b09-4c22-ad66-3f97406f16f7" path="/var/lib/kubelet/pods/016cbd75-1b09-4c22-ad66-3f97406f16f7/volumes" Dec 05 16:56:22 crc kubenswrapper[4778]: I1205 16:56:22.249664 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:56:22 crc kubenswrapper[4778]: E1205 16:56:22.249900 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:56:23 crc kubenswrapper[4778]: I1205 16:56:23.285819 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:56:23 crc kubenswrapper[4778]: E1205 16:56:23.286695 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:56:33 crc kubenswrapper[4778]: I1205 16:56:33.263232 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:56:33 crc kubenswrapper[4778]: E1205 16:56:33.264630 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:56:36 crc kubenswrapper[4778]: I1205 16:56:36.249752 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:56:36 crc kubenswrapper[4778]: E1205 16:56:36.250792 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:56:44 crc kubenswrapper[4778]: I1205 16:56:44.250052 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:56:44 crc kubenswrapper[4778]: E1205 16:56:44.250609 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.617549 4778 scope.go:117] "RemoveContainer" containerID="a3e7eb7460c4450190fa92a0d359ff7a3f0dc1f787fb0d7f3d5fb00a19b2ad74" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.658754 4778 scope.go:117] "RemoveContainer" containerID="adb5a97632ccfb50784f527b07b991250325245a40aecc1819c8fe4d216c3e55" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.707405 4778 scope.go:117] "RemoveContainer" containerID="d1e63dd94bea9419c5ddb14850476f117c62b99f060b15aa49ef4c283dae89d3" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.952895 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dhb96"] Dec 05 16:56:49 crc kubenswrapper[4778]: E1205 16:56:49.953413 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3" containerName="copy" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.953429 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3" containerName="copy" Dec 05 16:56:49 crc kubenswrapper[4778]: E1205 16:56:49.953447 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" containerName="registry-server" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.953454 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" containerName="registry-server" Dec 05 16:56:49 crc kubenswrapper[4778]: E1205 16:56:49.953472 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" containerName="extract-utilities" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.953484 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" containerName="extract-utilities" Dec 05 16:56:49 crc kubenswrapper[4778]: E1205 16:56:49.953494 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" containerName="extract-content" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.953501 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" containerName="extract-content" Dec 05 16:56:49 crc kubenswrapper[4778]: E1205 16:56:49.953520 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3" containerName="gather" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.953527 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3" containerName="gather" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.953745 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3" containerName="gather" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.953771 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c5ad118-bf9e-49cc-abb9-2c08bcb12ef3" containerName="copy" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.953783 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75fb65a-6b82-4afe-9c4f-264b0a4fc4c0" containerName="registry-server" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.955301 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:56:49 crc kubenswrapper[4778]: I1205 16:56:49.962034 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhb96"] Dec 05 16:56:50 crc kubenswrapper[4778]: I1205 16:56:50.072318 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ef62522-610f-4203-8446-ae937ed73c9f-utilities\") pod \"redhat-marketplace-dhb96\" (UID: \"0ef62522-610f-4203-8446-ae937ed73c9f\") " pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:56:50 crc kubenswrapper[4778]: I1205 16:56:50.072449 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ef62522-610f-4203-8446-ae937ed73c9f-catalog-content\") pod \"redhat-marketplace-dhb96\" (UID: \"0ef62522-610f-4203-8446-ae937ed73c9f\") " pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:56:50 crc kubenswrapper[4778]: I1205 16:56:50.072495 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4z45\" (UniqueName: \"kubernetes.io/projected/0ef62522-610f-4203-8446-ae937ed73c9f-kube-api-access-m4z45\") pod \"redhat-marketplace-dhb96\" (UID: \"0ef62522-610f-4203-8446-ae937ed73c9f\") " pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:56:50 crc kubenswrapper[4778]: I1205 16:56:50.174084 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ef62522-610f-4203-8446-ae937ed73c9f-utilities\") pod \"redhat-marketplace-dhb96\" (UID: \"0ef62522-610f-4203-8446-ae937ed73c9f\") " pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:56:50 crc kubenswrapper[4778]: I1205 16:56:50.174164 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ef62522-610f-4203-8446-ae937ed73c9f-catalog-content\") pod \"redhat-marketplace-dhb96\" (UID: \"0ef62522-610f-4203-8446-ae937ed73c9f\") " pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:56:50 crc kubenswrapper[4778]: I1205 16:56:50.174201 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4z45\" (UniqueName: \"kubernetes.io/projected/0ef62522-610f-4203-8446-ae937ed73c9f-kube-api-access-m4z45\") pod \"redhat-marketplace-dhb96\" (UID: \"0ef62522-610f-4203-8446-ae937ed73c9f\") " pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:56:50 crc kubenswrapper[4778]: I1205 16:56:50.174791 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ef62522-610f-4203-8446-ae937ed73c9f-catalog-content\") pod \"redhat-marketplace-dhb96\" (UID: \"0ef62522-610f-4203-8446-ae937ed73c9f\") " pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:56:50 crc kubenswrapper[4778]: I1205 16:56:50.175093 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ef62522-610f-4203-8446-ae937ed73c9f-utilities\") pod \"redhat-marketplace-dhb96\" (UID: \"0ef62522-610f-4203-8446-ae937ed73c9f\") " pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:56:50 crc kubenswrapper[4778]: I1205 16:56:50.201304 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4z45\" (UniqueName: \"kubernetes.io/projected/0ef62522-610f-4203-8446-ae937ed73c9f-kube-api-access-m4z45\") pod \"redhat-marketplace-dhb96\" (UID: \"0ef62522-610f-4203-8446-ae937ed73c9f\") " pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:56:50 crc kubenswrapper[4778]: I1205 16:56:50.249411 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:56:50 crc kubenswrapper[4778]: E1205 16:56:50.249673 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:56:50 crc kubenswrapper[4778]: I1205 16:56:50.302050 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:56:50 crc kubenswrapper[4778]: I1205 16:56:50.758006 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhb96"] Dec 05 16:56:50 crc kubenswrapper[4778]: W1205 16:56:50.767507 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ef62522_610f_4203_8446_ae937ed73c9f.slice/crio-6f9d952d114ae3215a1e23eaf69dd509a5980a4d748825be2c115b9b0a2ceb8e WatchSource:0}: Error finding container 6f9d952d114ae3215a1e23eaf69dd509a5980a4d748825be2c115b9b0a2ceb8e: Status 404 returned error can't find the container with id 6f9d952d114ae3215a1e23eaf69dd509a5980a4d748825be2c115b9b0a2ceb8e Dec 05 16:56:51 crc kubenswrapper[4778]: I1205 16:56:51.543556 4778 generic.go:334] "Generic (PLEG): container finished" podID="0ef62522-610f-4203-8446-ae937ed73c9f" containerID="a1e11928f6482157f0c26129effbfeb7c9a636b80d710bc233029d76bdc099bc" exitCode=0 Dec 05 16:56:51 crc kubenswrapper[4778]: I1205 16:56:51.543602 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhb96" event={"ID":"0ef62522-610f-4203-8446-ae937ed73c9f","Type":"ContainerDied","Data":"a1e11928f6482157f0c26129effbfeb7c9a636b80d710bc233029d76bdc099bc"} Dec 05 16:56:51 crc kubenswrapper[4778]: I1205 16:56:51.543664 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhb96" event={"ID":"0ef62522-610f-4203-8446-ae937ed73c9f","Type":"ContainerStarted","Data":"6f9d952d114ae3215a1e23eaf69dd509a5980a4d748825be2c115b9b0a2ceb8e"} Dec 05 16:56:51 crc kubenswrapper[4778]: I1205 16:56:51.545411 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 16:56:52 crc kubenswrapper[4778]: I1205 16:56:52.553199 4778 generic.go:334] "Generic (PLEG): container finished" podID="0ef62522-610f-4203-8446-ae937ed73c9f" containerID="b0d158c4d3a7c8df112d5a52d546d31e72400e761a10763e53b1acbeb4adc2f3" exitCode=0 Dec 05 16:56:52 crc kubenswrapper[4778]: I1205 16:56:52.553289 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhb96" event={"ID":"0ef62522-610f-4203-8446-ae937ed73c9f","Type":"ContainerDied","Data":"b0d158c4d3a7c8df112d5a52d546d31e72400e761a10763e53b1acbeb4adc2f3"} Dec 05 16:56:53 crc kubenswrapper[4778]: I1205 16:56:53.562518 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhb96" event={"ID":"0ef62522-610f-4203-8446-ae937ed73c9f","Type":"ContainerStarted","Data":"454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4"} Dec 05 16:56:53 crc kubenswrapper[4778]: I1205 16:56:53.590046 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dhb96" podStartSLOduration=3.172208502 podStartE2EDuration="4.590026471s" podCreationTimestamp="2025-12-05 16:56:49 +0000 UTC" firstStartedPulling="2025-12-05 16:56:51.545109586 +0000 UTC m=+3698.648905966" lastFinishedPulling="2025-12-05 16:56:52.962927555 +0000 UTC m=+3700.066723935" observedRunningTime="2025-12-05 16:56:53.587407899 +0000 UTC m=+3700.691204309" watchObservedRunningTime="2025-12-05 16:56:53.590026471 +0000 UTC m=+3700.693822851" Dec 05 16:56:55 crc kubenswrapper[4778]: I1205 16:56:55.249926 4778 scope.go:117] "RemoveContainer" 
containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:56:55 crc kubenswrapper[4778]: E1205 16:56:55.250860 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:57:00 crc kubenswrapper[4778]: I1205 16:57:00.303054 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:57:00 crc kubenswrapper[4778]: I1205 16:57:00.303575 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:57:00 crc kubenswrapper[4778]: I1205 16:57:00.362534 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:57:00 crc kubenswrapper[4778]: I1205 16:57:00.695976 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:57:03 crc kubenswrapper[4778]: I1205 16:57:03.264537 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:57:03 crc kubenswrapper[4778]: E1205 16:57:03.265531 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:57:03 crc kubenswrapper[4778]: I1205 16:57:03.936088 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhb96"] Dec 05 16:57:03 crc kubenswrapper[4778]: I1205 16:57:03.936381 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dhb96" podUID="0ef62522-610f-4203-8446-ae937ed73c9f" containerName="registry-server" containerID="cri-o://454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4" gracePeriod=2 Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.497602 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.532074 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4z45\" (UniqueName: \"kubernetes.io/projected/0ef62522-610f-4203-8446-ae937ed73c9f-kube-api-access-m4z45\") pod \"0ef62522-610f-4203-8446-ae937ed73c9f\" (UID: \"0ef62522-610f-4203-8446-ae937ed73c9f\") " Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.532217 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ef62522-610f-4203-8446-ae937ed73c9f-utilities\") pod \"0ef62522-610f-4203-8446-ae937ed73c9f\" (UID: \"0ef62522-610f-4203-8446-ae937ed73c9f\") " Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.532347 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ef62522-610f-4203-8446-ae937ed73c9f-catalog-content\") pod \"0ef62522-610f-4203-8446-ae937ed73c9f\" (UID: \"0ef62522-610f-4203-8446-ae937ed73c9f\") " Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.542196 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ef62522-610f-4203-8446-ae937ed73c9f-utilities" (OuterVolumeSpecName: "utilities") pod "0ef62522-610f-4203-8446-ae937ed73c9f" (UID: "0ef62522-610f-4203-8446-ae937ed73c9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.551538 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ef62522-610f-4203-8446-ae937ed73c9f-kube-api-access-m4z45" (OuterVolumeSpecName: "kube-api-access-m4z45") pod "0ef62522-610f-4203-8446-ae937ed73c9f" (UID: "0ef62522-610f-4203-8446-ae937ed73c9f"). InnerVolumeSpecName "kube-api-access-m4z45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.556874 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ef62522-610f-4203-8446-ae937ed73c9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ef62522-610f-4203-8446-ae937ed73c9f" (UID: "0ef62522-610f-4203-8446-ae937ed73c9f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.634671 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ef62522-610f-4203-8446-ae937ed73c9f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.634705 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4z45\" (UniqueName: \"kubernetes.io/projected/0ef62522-610f-4203-8446-ae937ed73c9f-kube-api-access-m4z45\") on node \"crc\" DevicePath \"\"" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.634716 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ef62522-610f-4203-8446-ae937ed73c9f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.678580 4778 generic.go:334] "Generic (PLEG): container finished" podID="0ef62522-610f-4203-8446-ae937ed73c9f" containerID="454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4" exitCode=0 Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.678618 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhb96" event={"ID":"0ef62522-610f-4203-8446-ae937ed73c9f","Type":"ContainerDied","Data":"454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4"} Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.678643 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhb96" event={"ID":"0ef62522-610f-4203-8446-ae937ed73c9f","Type":"ContainerDied","Data":"6f9d952d114ae3215a1e23eaf69dd509a5980a4d748825be2c115b9b0a2ceb8e"} Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.678660 4778 scope.go:117] "RemoveContainer" containerID="454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.678765 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhb96" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.712588 4778 scope.go:117] "RemoveContainer" containerID="b0d158c4d3a7c8df112d5a52d546d31e72400e761a10763e53b1acbeb4adc2f3" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.717020 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhb96"] Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.723813 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhb96"] Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.739197 4778 scope.go:117] "RemoveContainer" containerID="a1e11928f6482157f0c26129effbfeb7c9a636b80d710bc233029d76bdc099bc" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.768986 4778 scope.go:117] "RemoveContainer" containerID="454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4" Dec 05 16:57:05 crc kubenswrapper[4778]: E1205 16:57:05.769428 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4\": container with ID starting with 454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4 not found: ID does not exist" containerID="454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.769469 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4"} err="failed to get container status \"454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4\": rpc error: code = NotFound desc = could not find container \"454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4\": container with ID starting with 454e56da8c43f4743955850c1942f2b9bdfea379c5ff396ec164f7b84dbda4a4 not found: ID does not exist" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.769496 4778 scope.go:117] "RemoveContainer" containerID="b0d158c4d3a7c8df112d5a52d546d31e72400e761a10763e53b1acbeb4adc2f3" Dec 05 16:57:05 crc kubenswrapper[4778]: E1205 16:57:05.769827 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0d158c4d3a7c8df112d5a52d546d31e72400e761a10763e53b1acbeb4adc2f3\": container with ID starting with b0d158c4d3a7c8df112d5a52d546d31e72400e761a10763e53b1acbeb4adc2f3 not found: ID does not exist" containerID="b0d158c4d3a7c8df112d5a52d546d31e72400e761a10763e53b1acbeb4adc2f3" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.769860 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0d158c4d3a7c8df112d5a52d546d31e72400e761a10763e53b1acbeb4adc2f3"} err="failed to get container status \"b0d158c4d3a7c8df112d5a52d546d31e72400e761a10763e53b1acbeb4adc2f3\": rpc error: code = NotFound desc = could not find container \"b0d158c4d3a7c8df112d5a52d546d31e72400e761a10763e53b1acbeb4adc2f3\": container with ID starting with b0d158c4d3a7c8df112d5a52d546d31e72400e761a10763e53b1acbeb4adc2f3 not found: ID does not exist" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.769881 4778 scope.go:117] "RemoveContainer" containerID="a1e11928f6482157f0c26129effbfeb7c9a636b80d710bc233029d76bdc099bc" Dec 05 16:57:05 crc kubenswrapper[4778]: E1205 16:57:05.770096 4778 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a1e11928f6482157f0c26129effbfeb7c9a636b80d710bc233029d76bdc099bc\": container with ID starting with a1e11928f6482157f0c26129effbfeb7c9a636b80d710bc233029d76bdc099bc not found: ID does not exist" containerID="a1e11928f6482157f0c26129effbfeb7c9a636b80d710bc233029d76bdc099bc" Dec 05 16:57:05 crc kubenswrapper[4778]: I1205 16:57:05.770141 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e11928f6482157f0c26129effbfeb7c9a636b80d710bc233029d76bdc099bc"} err="failed to get container status \"a1e11928f6482157f0c26129effbfeb7c9a636b80d710bc233029d76bdc099bc\": rpc error: code = NotFound desc = could not find container \"a1e11928f6482157f0c26129effbfeb7c9a636b80d710bc233029d76bdc099bc\": container with ID starting with a1e11928f6482157f0c26129effbfeb7c9a636b80d710bc233029d76bdc099bc not found: ID does not exist" Dec 05 16:57:07 crc kubenswrapper[4778]: I1205 16:57:07.261887 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ef62522-610f-4203-8446-ae937ed73c9f" path="/var/lib/kubelet/pods/0ef62522-610f-4203-8446-ae937ed73c9f/volumes" Dec 05 16:57:09 crc kubenswrapper[4778]: I1205 16:57:09.249693 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:57:09 crc kubenswrapper[4778]: E1205 16:57:09.249913 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:57:14 crc kubenswrapper[4778]: I1205 16:57:14.249158 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:57:14 crc kubenswrapper[4778]: E1205 16:57:14.249836 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:57:22 crc kubenswrapper[4778]: I1205 16:57:22.249306 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:57:22 crc kubenswrapper[4778]: E1205 16:57:22.250162 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:57:29 crc kubenswrapper[4778]: I1205 16:57:29.249477 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:57:29 crc kubenswrapper[4778]: E1205 16:57:29.250156 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:57:36 crc kubenswrapper[4778]: I1205 16:57:36.249517 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:57:36 crc kubenswrapper[4778]: E1205 16:57:36.250165 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:57:44 crc kubenswrapper[4778]: I1205 16:57:44.249936 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:57:44 crc kubenswrapper[4778]: E1205 16:57:44.250724 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:57:49 crc kubenswrapper[4778]: I1205 16:57:49.249998 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:57:51 crc kubenswrapper[4778]: I1205 16:57:51.098084 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerStarted","Data":"8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7"} Dec 05 16:57:51 crc kubenswrapper[4778]: I1205 16:57:51.744438 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:57:51 crc kubenswrapper[4778]: I1205 16:57:51.744839 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:57:51 crc kubenswrapper[4778]: I1205 16:57:51.780687 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:57:52 crc kubenswrapper[4778]: I1205 16:57:52.144517 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:57:53 crc kubenswrapper[4778]: I1205 16:57:53.127457 4778 generic.go:334] "Generic (PLEG): container finished" podID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" exitCode=1 Dec 05 16:57:53 crc kubenswrapper[4778]: I1205 16:57:53.127549 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b","Type":"ContainerDied","Data":"8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7"} Dec 05 16:57:53 crc kubenswrapper[4778]: 
I1205 16:57:53.127626 4778 scope.go:117] "RemoveContainer" containerID="0649190912bbe29cc380cc1a22b72b16eb5ccf84021f303a96f37c6e0607f778" Dec 05 16:57:53 crc kubenswrapper[4778]: I1205 16:57:53.128274 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:57:53 crc kubenswrapper[4778]: E1205 16:57:53.129129 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:57:54 crc kubenswrapper[4778]: I1205 16:57:54.137650 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:57:54 crc kubenswrapper[4778]: E1205 16:57:54.138080 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:57:55 crc kubenswrapper[4778]: I1205 16:57:55.249478 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:57:55 crc kubenswrapper[4778]: E1205 16:57:55.249782 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:58:01 crc kubenswrapper[4778]: I1205 16:58:01.744182 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:58:01 crc kubenswrapper[4778]: I1205 16:58:01.745524 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:58:01 crc kubenswrapper[4778]: E1205 16:58:01.745794 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:58:09 crc kubenswrapper[4778]: I1205 16:58:09.249237 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:58:09 crc kubenswrapper[4778]: E1205 16:58:09.249935 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:58:13 crc kubenswrapper[4778]: I1205 16:58:13.253988 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:58:13 crc kubenswrapper[4778]: E1205 16:58:13.254899 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:58:21 crc kubenswrapper[4778]: I1205 16:58:21.249227 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:58:21 crc kubenswrapper[4778]: E1205 16:58:21.250089 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:58:21 crc kubenswrapper[4778]: I1205 16:58:21.744709 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:58:21 crc kubenswrapper[4778]: I1205 16:58:21.745038 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 16:58:21 crc kubenswrapper[4778]: I1205 16:58:21.745448 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:58:21 crc kubenswrapper[4778]: E1205 16:58:21.745701 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:58:22 crc kubenswrapper[4778]: I1205 16:58:22.398044 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:58:22 crc kubenswrapper[4778]: E1205 16:58:22.398466 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:58:34 crc kubenswrapper[4778]: I1205 16:58:34.249540 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:58:34 crc kubenswrapper[4778]: E1205 16:58:34.250216 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:58:34 crc kubenswrapper[4778]: I1205 16:58:34.250356 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:58:34 crc kubenswrapper[4778]: E1205 16:58:34.250542 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:58:46 crc kubenswrapper[4778]: I1205 16:58:46.249492 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:58:46 crc kubenswrapper[4778]: I1205 16:58:46.250018 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:58:46 crc kubenswrapper[4778]: E1205 16:58:46.250193 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:58:46 crc kubenswrapper[4778]: E1205 16:58:46.250320 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:58:58 crc kubenswrapper[4778]: I1205 16:58:58.249972 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:58:58 crc kubenswrapper[4778]: E1205 16:58:58.250726 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:58:59 crc kubenswrapper[4778]: I1205 16:58:59.249678 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:58:59 crc kubenswrapper[4778]: E1205 16:58:59.249943 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:59:12 crc kubenswrapper[4778]: I1205 16:59:12.249582 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:59:12 crc kubenswrapper[4778]: E1205 16:59:12.251289 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:59:13 crc kubenswrapper[4778]: I1205 16:59:13.256512 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:59:13 crc kubenswrapper[4778]: E1205 16:59:13.256804 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:59:23 crc kubenswrapper[4778]: I1205 16:59:23.260043 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:59:23 crc kubenswrapper[4778]: E1205 16:59:23.261796 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:59:24 crc kubenswrapper[4778]: I1205 16:59:24.249336 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:59:24 crc kubenswrapper[4778]: E1205 16:59:24.250017 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:59:35 crc kubenswrapper[4778]: I1205 16:59:35.252388 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:59:35 crc kubenswrapper[4778]: E1205 16:59:35.253179 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:59:38 crc kubenswrapper[4778]: I1205 16:59:38.249501 4778 scope.go:117] "RemoveContainer" 
containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:59:38 crc kubenswrapper[4778]: E1205 16:59:38.250233 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 16:59:47 crc kubenswrapper[4778]: I1205 16:59:47.250395 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 16:59:47 crc kubenswrapper[4778]: E1205 16:59:47.251058 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 16:59:52 crc kubenswrapper[4778]: I1205 16:59:52.250549 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 16:59:52 crc kubenswrapper[4778]: E1205 16:59:52.251263 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.189636 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6"] Dec 05 17:00:00 crc kubenswrapper[4778]: E1205 17:00:00.190632 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ef62522-610f-4203-8446-ae937ed73c9f" containerName="extract-content" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.190651 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ef62522-610f-4203-8446-ae937ed73c9f" containerName="extract-content" Dec 05 17:00:00 crc kubenswrapper[4778]: E1205 17:00:00.190668 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ef62522-610f-4203-8446-ae937ed73c9f" containerName="extract-utilities" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.190679 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ef62522-610f-4203-8446-ae937ed73c9f" containerName="extract-utilities" Dec 05 17:00:00 crc kubenswrapper[4778]: E1205 17:00:00.190694 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ef62522-610f-4203-8446-ae937ed73c9f" containerName="registry-server" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.190704 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ef62522-610f-4203-8446-ae937ed73c9f" containerName="registry-server" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.190911 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ef62522-610f-4203-8446-ae937ed73c9f" containerName="registry-server" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.191653 4778 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.194755 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.195381 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.204567 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6"] Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.301717 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24ccn\" (UniqueName: \"kubernetes.io/projected/798bc777-27bd-4c66-a700-a2fdf726fd2e-kube-api-access-24ccn\") pod \"collect-profiles-29415900-gtnx6\" (UID: \"798bc777-27bd-4c66-a700-a2fdf726fd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.301810 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/798bc777-27bd-4c66-a700-a2fdf726fd2e-secret-volume\") pod \"collect-profiles-29415900-gtnx6\" (UID: \"798bc777-27bd-4c66-a700-a2fdf726fd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.301893 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/798bc777-27bd-4c66-a700-a2fdf726fd2e-config-volume\") pod \"collect-profiles-29415900-gtnx6\" (UID: \"798bc777-27bd-4c66-a700-a2fdf726fd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.403019 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24ccn\" (UniqueName: \"kubernetes.io/projected/798bc777-27bd-4c66-a700-a2fdf726fd2e-kube-api-access-24ccn\") pod \"collect-profiles-29415900-gtnx6\" (UID: \"798bc777-27bd-4c66-a700-a2fdf726fd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.403100 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/798bc777-27bd-4c66-a700-a2fdf726fd2e-secret-volume\") pod \"collect-profiles-29415900-gtnx6\" (UID: \"798bc777-27bd-4c66-a700-a2fdf726fd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.403177 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/798bc777-27bd-4c66-a700-a2fdf726fd2e-config-volume\") pod \"collect-profiles-29415900-gtnx6\" (UID: \"798bc777-27bd-4c66-a700-a2fdf726fd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.404244 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/798bc777-27bd-4c66-a700-a2fdf726fd2e-config-volume\") pod \"collect-profiles-29415900-gtnx6\" (UID: \"798bc777-27bd-4c66-a700-a2fdf726fd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.680856 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/798bc777-27bd-4c66-a700-a2fdf726fd2e-secret-volume\") pod \"collect-profiles-29415900-gtnx6\" (UID: \"798bc777-27bd-4c66-a700-a2fdf726fd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.681242 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24ccn\" (UniqueName: \"kubernetes.io/projected/798bc777-27bd-4c66-a700-a2fdf726fd2e-kube-api-access-24ccn\") pod \"collect-profiles-29415900-gtnx6\" (UID: \"798bc777-27bd-4c66-a700-a2fdf726fd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:00 crc kubenswrapper[4778]: I1205 17:00:00.811486 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:01 crc kubenswrapper[4778]: I1205 17:00:01.249584 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 17:00:01 crc kubenswrapper[4778]: E1205 17:00:01.250073 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jqrsw_openshift-machine-config-operator(e780ff27-1d00-4280-8e7e-9eb9fe3dea6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" podUID="e780ff27-1d00-4280-8e7e-9eb9fe3dea6e" Dec 05 17:00:01 crc kubenswrapper[4778]: I1205 17:00:01.312782 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6"] Dec 05 17:00:02 crc kubenswrapper[4778]: I1205 17:00:02.231917 4778 generic.go:334] "Generic (PLEG): container finished" podID="798bc777-27bd-4c66-a700-a2fdf726fd2e" containerID="f13bd8630d5b9fe0511c4a5ad06d13b7363d22532cf361f1de8bb1a76af039d7" exitCode=0 Dec 05 17:00:02 crc kubenswrapper[4778]: I1205 17:00:02.232186 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" event={"ID":"798bc777-27bd-4c66-a700-a2fdf726fd2e","Type":"ContainerDied","Data":"f13bd8630d5b9fe0511c4a5ad06d13b7363d22532cf361f1de8bb1a76af039d7"} Dec 05 17:00:02 crc kubenswrapper[4778]: I1205 17:00:02.232298 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" event={"ID":"798bc777-27bd-4c66-a700-a2fdf726fd2e","Type":"ContainerStarted","Data":"4dca4b054dc239ae664291366921544c06ddc826b8948fd9cd509e834a38c1fd"} Dec 05 17:00:03 crc kubenswrapper[4778]: I1205 17:00:03.606522 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:03 crc kubenswrapper[4778]: I1205 17:00:03.676115 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24ccn\" (UniqueName: \"kubernetes.io/projected/798bc777-27bd-4c66-a700-a2fdf726fd2e-kube-api-access-24ccn\") pod \"798bc777-27bd-4c66-a700-a2fdf726fd2e\" (UID: \"798bc777-27bd-4c66-a700-a2fdf726fd2e\") " Dec 05 17:00:03 crc kubenswrapper[4778]: I1205 17:00:03.676208 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/798bc777-27bd-4c66-a700-a2fdf726fd2e-config-volume\") pod \"798bc777-27bd-4c66-a700-a2fdf726fd2e\" (UID: \"798bc777-27bd-4c66-a700-a2fdf726fd2e\") " Dec 05 17:00:03 crc kubenswrapper[4778]: I1205 17:00:03.676311 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/798bc777-27bd-4c66-a700-a2fdf726fd2e-secret-volume\") pod \"798bc777-27bd-4c66-a700-a2fdf726fd2e\" (UID: \"798bc777-27bd-4c66-a700-a2fdf726fd2e\") " Dec 05 17:00:03 crc kubenswrapper[4778]: I1205 17:00:03.677426 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798bc777-27bd-4c66-a700-a2fdf726fd2e-config-volume" (OuterVolumeSpecName: "config-volume") pod "798bc777-27bd-4c66-a700-a2fdf726fd2e" (UID: "798bc777-27bd-4c66-a700-a2fdf726fd2e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:00:03 crc kubenswrapper[4778]: I1205 17:00:03.686604 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798bc777-27bd-4c66-a700-a2fdf726fd2e-kube-api-access-24ccn" (OuterVolumeSpecName: "kube-api-access-24ccn") pod "798bc777-27bd-4c66-a700-a2fdf726fd2e" (UID: "798bc777-27bd-4c66-a700-a2fdf726fd2e"). InnerVolumeSpecName "kube-api-access-24ccn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:00:03 crc kubenswrapper[4778]: I1205 17:00:03.693506 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798bc777-27bd-4c66-a700-a2fdf726fd2e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "798bc777-27bd-4c66-a700-a2fdf726fd2e" (UID: "798bc777-27bd-4c66-a700-a2fdf726fd2e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:00:03 crc kubenswrapper[4778]: I1205 17:00:03.777918 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/798bc777-27bd-4c66-a700-a2fdf726fd2e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 17:00:03 crc kubenswrapper[4778]: I1205 17:00:03.777953 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24ccn\" (UniqueName: \"kubernetes.io/projected/798bc777-27bd-4c66-a700-a2fdf726fd2e-kube-api-access-24ccn\") on node \"crc\" DevicePath \"\"" Dec 05 17:00:03 crc kubenswrapper[4778]: I1205 17:00:03.777968 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/798bc777-27bd-4c66-a700-a2fdf726fd2e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 17:00:04 crc kubenswrapper[4778]: I1205 17:00:04.252232 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" event={"ID":"798bc777-27bd-4c66-a700-a2fdf726fd2e","Type":"ContainerDied","Data":"4dca4b054dc239ae664291366921544c06ddc826b8948fd9cd509e834a38c1fd"} Dec 05 17:00:04 crc kubenswrapper[4778]: I1205 17:00:04.252518 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dca4b054dc239ae664291366921544c06ddc826b8948fd9cd509e834a38c1fd" Dec 05 17:00:04 crc kubenswrapper[4778]: I1205 17:00:04.252277 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-gtnx6" Dec 05 17:00:04 crc kubenswrapper[4778]: I1205 17:00:04.710489 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn"] Dec 05 17:00:04 crc kubenswrapper[4778]: I1205 17:00:04.718441 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415855-lmgjn"] Dec 05 17:00:05 crc kubenswrapper[4778]: I1205 17:00:05.250496 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 17:00:05 crc kubenswrapper[4778]: E1205 17:00:05.250759 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 17:00:05 crc kubenswrapper[4778]: I1205 17:00:05.260136 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a44f580-21af-464b-a03e-fbd39614e1f9" path="/var/lib/kubelet/pods/4a44f580-21af-464b-a03e-fbd39614e1f9/volumes" Dec 05 17:00:16 crc kubenswrapper[4778]: I1205 17:00:16.249962 4778 scope.go:117] "RemoveContainer" containerID="39f018c1335ac2ee78576634abec4604fcf162caef7c87e715bb5c760b756450" Dec 05 17:00:17 crc kubenswrapper[4778]: I1205 17:00:17.363062 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jqrsw" event={"ID":"e780ff27-1d00-4280-8e7e-9eb9fe3dea6e","Type":"ContainerStarted","Data":"7e741fb1f15ebe338e6987918894f114528c0e294315f880b75b748af06d74e5"} Dec 05 17:00:18 crc kubenswrapper[4778]: I1205 17:00:18.249914 4778 scope.go:117] 
"RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 17:00:18 crc kubenswrapper[4778]: E1205 17:00:18.250733 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.152943 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gg644"] Dec 05 17:00:28 crc kubenswrapper[4778]: E1205 17:00:28.154332 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798bc777-27bd-4c66-a700-a2fdf726fd2e" containerName="collect-profiles" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.154363 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="798bc777-27bd-4c66-a700-a2fdf726fd2e" containerName="collect-profiles" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.154864 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="798bc777-27bd-4c66-a700-a2fdf726fd2e" containerName="collect-profiles" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.157825 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.174518 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gg644"] Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.257384 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e4cd93-3c94-4480-8000-23ed0d72e9bd-catalog-content\") pod \"certified-operators-gg644\" (UID: \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\") " pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.257476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80e4cd93-3c94-4480-8000-23ed0d72e9bd-utilities\") pod \"certified-operators-gg644\" (UID: \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\") " pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.257495 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdxpn\" (UniqueName: \"kubernetes.io/projected/80e4cd93-3c94-4480-8000-23ed0d72e9bd-kube-api-access-qdxpn\") pod \"certified-operators-gg644\" (UID: \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\") " pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.359249 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80e4cd93-3c94-4480-8000-23ed0d72e9bd-utilities\") pod \"certified-operators-gg644\" (UID: \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\") " pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.359299 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdxpn\" (UniqueName: 
\"kubernetes.io/projected/80e4cd93-3c94-4480-8000-23ed0d72e9bd-kube-api-access-qdxpn\") pod \"certified-operators-gg644\" (UID: \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\") " pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.359450 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e4cd93-3c94-4480-8000-23ed0d72e9bd-catalog-content\") pod \"certified-operators-gg644\" (UID: \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\") " pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.360083 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e4cd93-3c94-4480-8000-23ed0d72e9bd-catalog-content\") pod \"certified-operators-gg644\" (UID: \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\") " pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.360524 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80e4cd93-3c94-4480-8000-23ed0d72e9bd-utilities\") pod \"certified-operators-gg644\" (UID: \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\") " pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.385457 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdxpn\" (UniqueName: \"kubernetes.io/projected/80e4cd93-3c94-4480-8000-23ed0d72e9bd-kube-api-access-qdxpn\") pod \"certified-operators-gg644\" (UID: \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\") " pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:28 crc kubenswrapper[4778]: I1205 17:00:28.481723 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:29 crc kubenswrapper[4778]: I1205 17:00:29.027675 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gg644"] Dec 05 17:00:29 crc kubenswrapper[4778]: I1205 17:00:29.462510 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gg644" event={"ID":"80e4cd93-3c94-4480-8000-23ed0d72e9bd","Type":"ContainerStarted","Data":"80332712b9e4977de90ab00e5b4a68f8757ebed379bfab050aed391ffba50865"} Dec 05 17:00:30 crc kubenswrapper[4778]: I1205 17:00:30.250156 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 17:00:30 crc kubenswrapper[4778]: E1205 17:00:30.250840 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 17:00:30 crc kubenswrapper[4778]: I1205 17:00:30.470620 4778 generic.go:334] "Generic (PLEG): container finished" podID="80e4cd93-3c94-4480-8000-23ed0d72e9bd" containerID="92acc75e2a4273d1969efa73376e065aac8e423cb2fdb937d1f7be137971a9fa" exitCode=0 Dec 05 17:00:30 crc kubenswrapper[4778]: I1205 17:00:30.470673 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gg644" event={"ID":"80e4cd93-3c94-4480-8000-23ed0d72e9bd","Type":"ContainerDied","Data":"92acc75e2a4273d1969efa73376e065aac8e423cb2fdb937d1f7be137971a9fa"} Dec 05 17:00:32 crc kubenswrapper[4778]: I1205 17:00:32.490419 4778 generic.go:334] "Generic (PLEG): container finished" podID="80e4cd93-3c94-4480-8000-23ed0d72e9bd" containerID="9ee63788dc4115dcb8f225aca604710da722ab42bd80a291f7b9975999ed318a" exitCode=0 Dec 05 17:00:32 crc kubenswrapper[4778]: I1205 17:00:32.490466 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gg644" event={"ID":"80e4cd93-3c94-4480-8000-23ed0d72e9bd","Type":"ContainerDied","Data":"9ee63788dc4115dcb8f225aca604710da722ab42bd80a291f7b9975999ed318a"} Dec 05 17:00:33 crc kubenswrapper[4778]: I1205 17:00:33.503514 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gg644" event={"ID":"80e4cd93-3c94-4480-8000-23ed0d72e9bd","Type":"ContainerStarted","Data":"6f3e6e44ce036dd596cdfb9286ea9fcfc83dded02adc464bb47f9a6e5f0e1992"} Dec 05 17:00:33 crc kubenswrapper[4778]: I1205 17:00:33.533734 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gg644" podStartSLOduration=2.839235023 podStartE2EDuration="5.533711395s" podCreationTimestamp="2025-12-05 17:00:28 +0000 UTC" firstStartedPulling="2025-12-05 17:00:30.472519852 +0000 UTC m=+3917.576316232" lastFinishedPulling="2025-12-05 17:00:33.166996224 +0000 UTC m=+3920.270792604" observedRunningTime="2025-12-05 17:00:33.525559593 +0000 UTC m=+3920.629355983" watchObservedRunningTime="2025-12-05 17:00:33.533711395 +0000 UTC m=+3920.637507785" Dec 05 17:00:38 crc kubenswrapper[4778]: I1205 17:00:38.482186 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:38 crc 
kubenswrapper[4778]: I1205 17:00:38.482670 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:38 crc kubenswrapper[4778]: I1205 17:00:38.543868 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:38 crc kubenswrapper[4778]: I1205 17:00:38.601474 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:41 crc kubenswrapper[4778]: I1205 17:00:41.249691 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 17:00:41 crc kubenswrapper[4778]: E1205 17:00:41.250137 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 17:00:42 crc kubenswrapper[4778]: I1205 17:00:42.131795 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gg644"] Dec 05 17:00:42 crc kubenswrapper[4778]: I1205 17:00:42.132296 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gg644" podUID="80e4cd93-3c94-4480-8000-23ed0d72e9bd" containerName="registry-server" containerID="cri-o://6f3e6e44ce036dd596cdfb9286ea9fcfc83dded02adc464bb47f9a6e5f0e1992" gracePeriod=2 Dec 05 17:00:43 crc kubenswrapper[4778]: I1205 17:00:43.583655 4778 generic.go:334] "Generic (PLEG): container finished" podID="80e4cd93-3c94-4480-8000-23ed0d72e9bd" containerID="6f3e6e44ce036dd596cdfb9286ea9fcfc83dded02adc464bb47f9a6e5f0e1992" exitCode=0 Dec 05 17:00:43 crc kubenswrapper[4778]: I1205 17:00:43.583945 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gg644" event={"ID":"80e4cd93-3c94-4480-8000-23ed0d72e9bd","Type":"ContainerDied","Data":"6f3e6e44ce036dd596cdfb9286ea9fcfc83dded02adc464bb47f9a6e5f0e1992"} Dec 05 17:00:43 crc kubenswrapper[4778]: I1205 17:00:43.673748 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:43 crc kubenswrapper[4778]: I1205 17:00:43.699739 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdxpn\" (UniqueName: \"kubernetes.io/projected/80e4cd93-3c94-4480-8000-23ed0d72e9bd-kube-api-access-qdxpn\") pod \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\" (UID: \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\") " Dec 05 17:00:43 crc kubenswrapper[4778]: I1205 17:00:43.699865 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80e4cd93-3c94-4480-8000-23ed0d72e9bd-utilities\") pod \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\" (UID: \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\") " Dec 05 17:00:43 crc kubenswrapper[4778]: I1205 17:00:43.699894 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e4cd93-3c94-4480-8000-23ed0d72e9bd-catalog-content\") pod \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\" (UID: \"80e4cd93-3c94-4480-8000-23ed0d72e9bd\") " Dec 05 17:00:43 crc kubenswrapper[4778]: I1205 17:00:43.700891 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80e4cd93-3c94-4480-8000-23ed0d72e9bd-utilities" (OuterVolumeSpecName: "utilities") pod "80e4cd93-3c94-4480-8000-23ed0d72e9bd" (UID: "80e4cd93-3c94-4480-8000-23ed0d72e9bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:00:43 crc kubenswrapper[4778]: I1205 17:00:43.705386 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80e4cd93-3c94-4480-8000-23ed0d72e9bd-kube-api-access-qdxpn" (OuterVolumeSpecName: "kube-api-access-qdxpn") pod "80e4cd93-3c94-4480-8000-23ed0d72e9bd" (UID: "80e4cd93-3c94-4480-8000-23ed0d72e9bd"). InnerVolumeSpecName "kube-api-access-qdxpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:00:43 crc kubenswrapper[4778]: I1205 17:00:43.752486 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80e4cd93-3c94-4480-8000-23ed0d72e9bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80e4cd93-3c94-4480-8000-23ed0d72e9bd" (UID: "80e4cd93-3c94-4480-8000-23ed0d72e9bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:00:43 crc kubenswrapper[4778]: I1205 17:00:43.802067 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdxpn\" (UniqueName: \"kubernetes.io/projected/80e4cd93-3c94-4480-8000-23ed0d72e9bd-kube-api-access-qdxpn\") on node \"crc\" DevicePath \"\"" Dec 05 17:00:43 crc kubenswrapper[4778]: I1205 17:00:43.802932 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80e4cd93-3c94-4480-8000-23ed0d72e9bd-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:00:43 crc kubenswrapper[4778]: I1205 17:00:43.802954 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e4cd93-3c94-4480-8000-23ed0d72e9bd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:00:44 crc kubenswrapper[4778]: I1205 17:00:44.592824 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gg644" event={"ID":"80e4cd93-3c94-4480-8000-23ed0d72e9bd","Type":"ContainerDied","Data":"80332712b9e4977de90ab00e5b4a68f8757ebed379bfab050aed391ffba50865"} Dec 05 17:00:44 crc kubenswrapper[4778]: I1205 17:00:44.593060 4778 scope.go:117] "RemoveContainer" containerID="6f3e6e44ce036dd596cdfb9286ea9fcfc83dded02adc464bb47f9a6e5f0e1992" Dec 05 17:00:44 crc kubenswrapper[4778]: I1205 17:00:44.593105 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gg644" Dec 05 17:00:44 crc kubenswrapper[4778]: I1205 17:00:44.614688 4778 scope.go:117] "RemoveContainer" containerID="9ee63788dc4115dcb8f225aca604710da722ab42bd80a291f7b9975999ed318a" Dec 05 17:00:44 crc kubenswrapper[4778]: I1205 17:00:44.636327 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gg644"] Dec 05 17:00:44 crc kubenswrapper[4778]: I1205 17:00:44.643681 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gg644"] Dec 05 17:00:44 crc kubenswrapper[4778]: I1205 17:00:44.650071 4778 scope.go:117] "RemoveContainer" containerID="92acc75e2a4273d1969efa73376e065aac8e423cb2fdb937d1f7be137971a9fa" Dec 05 17:00:45 crc kubenswrapper[4778]: I1205 17:00:45.260226 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80e4cd93-3c94-4480-8000-23ed0d72e9bd" path="/var/lib/kubelet/pods/80e4cd93-3c94-4480-8000-23ed0d72e9bd/volumes" Dec 05 17:00:50 crc kubenswrapper[4778]: I1205 17:00:50.090693 4778 scope.go:117] "RemoveContainer" containerID="8daa378cac1ab97776954f611d4f0378a1a81c33014925887a6950e56ce90baf" Dec 05 17:00:52 crc kubenswrapper[4778]: I1205 17:00:52.250013 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 17:00:52 crc kubenswrapper[4778]: E1205 17:00:52.250838 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.150604 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-cron-29415901-r9vvd"] Dec 05 17:01:00 crc 
kubenswrapper[4778]: E1205 17:01:00.151469 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e4cd93-3c94-4480-8000-23ed0d72e9bd" containerName="extract-utilities" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.151482 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e4cd93-3c94-4480-8000-23ed0d72e9bd" containerName="extract-utilities" Dec 05 17:01:00 crc kubenswrapper[4778]: E1205 17:01:00.151495 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e4cd93-3c94-4480-8000-23ed0d72e9bd" containerName="extract-content" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.151501 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e4cd93-3c94-4480-8000-23ed0d72e9bd" containerName="extract-content" Dec 05 17:01:00 crc kubenswrapper[4778]: E1205 17:01:00.151512 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e4cd93-3c94-4480-8000-23ed0d72e9bd" containerName="registry-server" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.151518 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e4cd93-3c94-4480-8000-23ed0d72e9bd" containerName="registry-server" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.151727 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="80e4cd93-3c94-4480-8000-23ed0d72e9bd" containerName="registry-server" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.152293 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.168782 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-cron-29415901-r9vvd"] Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.178446 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwp4n\" (UniqueName: \"kubernetes.io/projected/30e9ad2b-c067-43ae-9324-503042a65960-kube-api-access-zwp4n\") pod \"keystone-cron-29415901-r9vvd\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.178527 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-combined-ca-bundle\") pod \"keystone-cron-29415901-r9vvd\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.178580 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-fernet-keys\") pod \"keystone-cron-29415901-r9vvd\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.178614 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-config-data\") pod \"keystone-cron-29415901-r9vvd\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.280317 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zwp4n\" (UniqueName: \"kubernetes.io/projected/30e9ad2b-c067-43ae-9324-503042a65960-kube-api-access-zwp4n\") pod \"keystone-cron-29415901-r9vvd\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.280404 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-combined-ca-bundle\") pod \"keystone-cron-29415901-r9vvd\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.280447 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-fernet-keys\") pod \"keystone-cron-29415901-r9vvd\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.280470 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-config-data\") pod \"keystone-cron-29415901-r9vvd\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.287398 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-config-data\") pod \"keystone-cron-29415901-r9vvd\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.290110 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-combined-ca-bundle\") pod \"keystone-cron-29415901-r9vvd\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.290290 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-fernet-keys\") pod \"keystone-cron-29415901-r9vvd\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.301765 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwp4n\" (UniqueName: \"kubernetes.io/projected/30e9ad2b-c067-43ae-9324-503042a65960-kube-api-access-zwp4n\") pod \"keystone-cron-29415901-r9vvd\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.468409 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:00 crc kubenswrapper[4778]: I1205 17:01:00.962238 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-cron-29415901-r9vvd"] Dec 05 17:01:01 crc kubenswrapper[4778]: I1205 17:01:01.747879 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" event={"ID":"30e9ad2b-c067-43ae-9324-503042a65960","Type":"ContainerStarted","Data":"e35828f63c76ba3c302822da35a58df055372e2cdf7dc86257c7cb201d361434"} Dec 05 17:01:01 crc kubenswrapper[4778]: I1205 17:01:01.748593 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" event={"ID":"30e9ad2b-c067-43ae-9324-503042a65960","Type":"ContainerStarted","Data":"05c8dff2b8536211ea20265ea8459c056337e83643d4e486734590bd8d801b87"} Dec 05 17:01:01 crc kubenswrapper[4778]: I1205 17:01:01.764945 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" podStartSLOduration=1.764923813 podStartE2EDuration="1.764923813s" podCreationTimestamp="2025-12-05 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:01:01.761340336 +0000 UTC m=+3948.865136756" watchObservedRunningTime="2025-12-05 17:01:01.764923813 +0000 UTC m=+3948.868720203" Dec 05 17:01:03 crc kubenswrapper[4778]: I1205 17:01:03.767141 4778 generic.go:334] "Generic (PLEG): container finished" podID="30e9ad2b-c067-43ae-9324-503042a65960" containerID="e35828f63c76ba3c302822da35a58df055372e2cdf7dc86257c7cb201d361434" exitCode=0 Dec 05 17:01:03 crc kubenswrapper[4778]: I1205 17:01:03.767188 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" event={"ID":"30e9ad2b-c067-43ae-9324-503042a65960","Type":"ContainerDied","Data":"e35828f63c76ba3c302822da35a58df055372e2cdf7dc86257c7cb201d361434"} Dec 05 17:01:04 crc kubenswrapper[4778]: I1205 17:01:04.249481 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 17:01:04 crc kubenswrapper[4778]: E1205 17:01:04.249746 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b" Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.115576 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.260924 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwp4n\" (UniqueName: \"kubernetes.io/projected/30e9ad2b-c067-43ae-9324-503042a65960-kube-api-access-zwp4n\") pod \"30e9ad2b-c067-43ae-9324-503042a65960\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.260990 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-config-data\") pod \"30e9ad2b-c067-43ae-9324-503042a65960\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.261012 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-fernet-keys\") pod \"30e9ad2b-c067-43ae-9324-503042a65960\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.261127 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-combined-ca-bundle\") pod \"30e9ad2b-c067-43ae-9324-503042a65960\" (UID: \"30e9ad2b-c067-43ae-9324-503042a65960\") " Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.266891 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e9ad2b-c067-43ae-9324-503042a65960-kube-api-access-zwp4n" (OuterVolumeSpecName: "kube-api-access-zwp4n") pod "30e9ad2b-c067-43ae-9324-503042a65960" (UID: "30e9ad2b-c067-43ae-9324-503042a65960"). InnerVolumeSpecName "kube-api-access-zwp4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.277561 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "30e9ad2b-c067-43ae-9324-503042a65960" (UID: "30e9ad2b-c067-43ae-9324-503042a65960"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.285002 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30e9ad2b-c067-43ae-9324-503042a65960" (UID: "30e9ad2b-c067-43ae-9324-503042a65960"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.319217 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-config-data" (OuterVolumeSpecName: "config-data") pod "30e9ad2b-c067-43ae-9324-503042a65960" (UID: "30e9ad2b-c067-43ae-9324-503042a65960"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.364395 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwp4n\" (UniqueName: \"kubernetes.io/projected/30e9ad2b-c067-43ae-9324-503042a65960-kube-api-access-zwp4n\") on node \"crc\" DevicePath \"\"" Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.364436 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.364445 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.364453 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e9ad2b-c067-43ae-9324-503042a65960-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.786824 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" event={"ID":"30e9ad2b-c067-43ae-9324-503042a65960","Type":"ContainerDied","Data":"05c8dff2b8536211ea20265ea8459c056337e83643d4e486734590bd8d801b87"} Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.787119 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05c8dff2b8536211ea20265ea8459c056337e83643d4e486734590bd8d801b87" Dec 05 17:01:05 crc kubenswrapper[4778]: I1205 17:01:05.786885 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-cron-29415901-r9vvd" Dec 05 17:01:15 crc kubenswrapper[4778]: I1205 17:01:15.249237 4778 scope.go:117] "RemoveContainer" containerID="8f4dc4c007bbef5e23e8afe286017bd4d8ba7e42cf34a03d3af41a13bea5cbc7" Dec 05 17:01:15 crc kubenswrapper[4778]: E1205 17:01:15.250023 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=watcher-decision-engine pod=watcher-kuttl-decision-engine-0_watcher-kuttl-default(eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b)\"" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="eaaac528-e568-4bd8-a7e9-eebdcbdc4b7b"